Apr 17 18:10:03.109554 ip-10-0-134-83 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 18:10:03.515167 ip-10-0-134-83 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:03.515167 ip-10-0-134-83 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 18:10:03.515167 ip-10-0-134-83 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:03.515167 ip-10-0-134-83 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 18:10:03.515167 ip-10-0-134-83 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 18:10:03.515955 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.515856 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 18:10:03.520144 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520113 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:03.520144 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520141 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
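The deprecation warnings above all point at the kubelet config file. A minimal sketch of how such flags might be carried in a KubeletConfiguration instead — field values here are illustrative assumptions, not read from this node:

```yaml
# Hypothetical KubeletConfiguration fragment (kubelet.config.k8s.io/v1beta1).
# Values are illustrative placeholders, not this node's actual settings.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock  # replaces --container-runtime-endpoint
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # replaces --volume-plugin-dir
systemReserved:           # replaces --system-reserved
  cpu: 500m
  memory: 1Gi
evictionHard:             # preferred over --minimum-container-ttl-duration
  memory.available: 100Mi
```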
Apr 17 18:10:03.520144 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520148 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:03.520144 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520152 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520156 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520159 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520162 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520165 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520167 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520170 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520172 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520175 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520191 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520194 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520197 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520200 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520202 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520205 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520209 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520211 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520214 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520216 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520219 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:03.520358 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520222 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520225 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520227 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520230 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520232 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520235 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520238 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520242 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520246 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520248 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520252 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520254 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520257 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520259 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520265 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520270 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520273 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520276 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520279 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520282 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:03.520836 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520285 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520288 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520290 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520293 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520296 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520299 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520301 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520304 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520307 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520309 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520312 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520314 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520316 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520319 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520321 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520324 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520326 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520329 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520332 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520335 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:10:03.521398 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520337 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520340 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520343 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520345 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520348 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520351 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520354 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520356 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520359 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520362 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520364 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520368 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520372 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520375 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520377 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520388 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520391 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520393 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520396 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520399 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:03.521874 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520402 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520404 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520407 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520857 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520863 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520866 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520869 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520871 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520874 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520878 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520882 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520885 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520887 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520890 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520893 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520895 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520898 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520900 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520903 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:10:03.522379 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520906 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520908 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520911 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520914 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520916 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520918 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520921 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520924 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520934 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520937 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520939 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520942 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520944 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520946 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520949 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520952 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520954 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520957 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520960 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520962 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:03.522841 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520964 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520967 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520971 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520975 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520979 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520982 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520985 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520988 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520990 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520993 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520995 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.520999 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521002 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521004 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521007 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521009 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521012 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521014 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521017 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521020 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:03.523370 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521022 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521031 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521034 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521036 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521039 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521041 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521044 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521046 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521049 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521052 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521054 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521057 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521059 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521061 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521063 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521068 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521071 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521073 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521075 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521078 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:03.523898 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521081 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521083 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521085 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521089 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521091 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521094 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521096 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521099 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521102 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.521104 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521771 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521784 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521791 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521796 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521812 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521815 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521821 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521825 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521828 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521831 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521835 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 18:10:03.524408 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521838 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521841 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521844 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521847 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521850 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521853 2576 flags.go:64] FLAG: --cloud-config=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521856 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521859 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521866 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521869 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521872 2576 flags.go:64] FLAG: --config-dir=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521875 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521878 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521883 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521886 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521890 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521893 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521897 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521899 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521902 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521905 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521908 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521913 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521916 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521919 2576 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 18:10:03.524926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521922 2576 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521931 2576 flags.go:64] FLAG: --enable-server="true"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521934 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521941 2576 flags.go:64] FLAG: --event-burst="100"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521945 2576 flags.go:64] FLAG: --event-qps="50"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521948 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521951 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521954 2576 flags.go:64] FLAG: --eviction-hard=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521958 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521961 2576 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521964 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521967 2576 flags.go:64] FLAG: --eviction-soft=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521970 2576 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521973 2576 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521977 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521979 2576 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521982 2576 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521985 2576 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521988 2576 flags.go:64] FLAG: --feature-gates=""
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521992 2576 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521995 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.521999 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522003 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522006 2576 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522009 2576 flags.go:64] FLAG: --help="false"
Apr 17 18:10:03.525572 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522012 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522015 2576 flags.go:64] FLAG:
--housekeeping-interval="10s" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522018 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522021 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522025 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522028 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522031 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522034 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522037 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522046 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522049 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522052 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522055 2576 flags.go:64] FLAG: --kube-reserved="" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522059 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522061 2576 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: 
I0417 18:10:03.522064 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522067 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522070 2576 flags.go:64] FLAG: --lock-file="" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522074 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522077 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522080 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522087 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522090 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522093 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 18:10:03.526173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522095 2576 flags.go:64] FLAG: --logging-format="text" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522098 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522102 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522104 2576 flags.go:64] FLAG: --manifest-url="" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522108 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522113 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 18:10:03.526806 
ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522117 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522121 2576 flags.go:64] FLAG: --max-pods="110" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522124 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522127 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522130 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522133 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522136 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522139 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522141 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522151 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522154 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522156 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522166 2576 flags.go:64] FLAG: --pod-cidr="" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522169 2576 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522175 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522190 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522193 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522196 2576 flags.go:64] FLAG: --port="10250" Apr 17 18:10:03.526806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522200 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522203 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-034f43e58c372055c" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522206 2576 flags.go:64] FLAG: --qos-reserved="" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522210 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522213 2576 flags.go:64] FLAG: --register-node="true" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522216 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522219 2576 flags.go:64] FLAG: --register-with-taints="" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522223 2576 flags.go:64] FLAG: --registry-burst="10" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522225 2576 flags.go:64] FLAG: --registry-qps="5" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522228 2576 flags.go:64] FLAG: --reserved-cpus="" Apr 17 18:10:03.527411 
ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522231 2576 flags.go:64] FLAG: --reserved-memory="" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522235 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522238 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522242 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522245 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522248 2576 flags.go:64] FLAG: --runonce="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522251 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522254 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522257 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522262 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522265 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522268 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522272 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522275 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522278 2576 flags.go:64] 
FLAG: --storage-driver-secure="false" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522281 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 18:10:03.527411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522284 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522293 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522297 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522300 2576 flags.go:64] FLAG: --system-cgroups="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522302 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522308 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522311 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522314 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522323 2576 flags.go:64] FLAG: --tls-min-version="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522326 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522331 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522334 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522337 2576 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 18:10:03.528030 ip-10-0-134-83 
kubenswrapper[2576]: I0417 18:10:03.522340 2576 flags.go:64] FLAG: --v="2" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522344 2576 flags.go:64] FLAG: --version="false" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522348 2576 flags.go:64] FLAG: --vmodule="" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522353 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.522356 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522465 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522469 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522472 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522476 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522480 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 18:10:03.528030 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522482 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522485 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522488 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522491 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522493 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522496 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522498 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522501 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522503 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522506 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522508 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522518 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522521 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522523 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522525 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522528 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522530 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522533 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522535 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:03.528610 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522540 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522542 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522544 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522547 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522549 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522552 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522555 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522557 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522559 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522562 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522564 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522567 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522569 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522572 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522574 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522577 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522580 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522582 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522585 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522587 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:03.529142 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522590 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522593 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522596 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522598 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522602 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522611 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522614 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522617 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522620 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522622 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522625 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522627 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522631 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522634 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522636 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522639 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522641 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522644 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522646 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:03.529648 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522648 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522651 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522654 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522656 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522659 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522661 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522664 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522666 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522669 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522672 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522674 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522677 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522679 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522682 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522684 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522687 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522689 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522691 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522694 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522702 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:03.530131 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522705 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 18:10:03.530641 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522707 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:03.530641 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.522710 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:03.530641 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.523456 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 18:10:03.530641 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.530511 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 18:10:03.530641 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.530537 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 18:10:03.531313 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531298 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 18:10:03.531313 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531311 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 18:10:03.531313 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531315 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531318 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531322 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531325 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531328 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531332 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531335 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531338 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531341 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531345 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531350 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531353 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531356 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531359 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531362 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531365 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531367 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531370 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531373 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531376 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 18:10:03.531408 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531378 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531381 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531383 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531386 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531389 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531391 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531394 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531397 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531399 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531404 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531407 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 18:10:03.532003
ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531410 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531413 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531416 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531419 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531424 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531428 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531431 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531434 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:10:03.532003 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531437 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531440 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531443 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531446 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 
18:10:03.531449 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531451 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531454 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531456 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531459 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531462 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531465 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531467 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531469 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531472 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531475 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531478 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531481 2576 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesvSphere Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531483 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531486 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:10:03.532501 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531488 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531491 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531493 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531496 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531499 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531502 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531505 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531508 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531510 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531513 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531516 2576 feature_gate.go:328] unrecognized 
feature gate: ImageModeStatusReporting Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531518 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531521 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531523 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531526 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531528 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531530 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531533 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531535 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531538 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:10:03.532966 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531540 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531542 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531545 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531547 2576 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531550 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531553 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.531558 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531669 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531676 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531679 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531682 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531685 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531688 2576 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531691 2576 
feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531693 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 18:10:03.533477 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531696 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531698 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531702 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531704 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531708 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531712 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531714 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531717 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531720 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531722 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531725 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531727 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531730 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531732 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531735 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531737 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531739 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531742 2576 feature_gate.go:328] unrecognized 
feature gate: NewOLMCatalogdAPIV1Metas Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531744 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 18:10:03.533900 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531747 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531749 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531769 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531773 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531776 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531780 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531785 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531788 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531791 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531794 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531797 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 
18:10:03.531800 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531802 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531805 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531808 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531811 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531814 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531817 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531820 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 18:10:03.534392 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531822 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531825 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531827 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531829 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531832 2576 feature_gate.go:328] 
unrecognized feature gate: AlibabaPlatform Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531836 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531840 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531844 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531847 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531849 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531852 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531855 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531858 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531861 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531863 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531866 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531868 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 18:10:03.534869 ip-10-0-134-83 
kubenswrapper[2576]: W0417 18:10:03.531871 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531874 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531877 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 18:10:03.534869 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531880 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531883 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531886 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531888 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531891 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531894 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531897 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531900 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531903 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531905 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 18:10:03.535399 
ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531908 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531911 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531913 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531916 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531918 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531921 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531923 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531925 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531928 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 18:10:03.535399 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:03.531931 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.531936 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.532080 2576 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.534455 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.535341 2576 server.go:1019] "Starting client certificate rotation" Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.535458 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:10:03.535869 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.535505 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 18:10:03.558557 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.558524 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:10:03.561220 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.561140 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 18:10:03.577778 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.577740 2576 log.go:25] "Validated CRI v1 runtime API" Apr 17 18:10:03.585067 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.585044 2576 log.go:25] "Validated CRI v1 image API" Apr 17 18:10:03.586488 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.586451 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 18:10:03.590489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.590458 2576 fs.go:135] Filesystem UUIDs: 
map[7B77-95E7:/dev/nvme0n1p2 9cc77f07-2a14-4548-a39a-e626a0cbf469:/dev/nvme0n1p4 bc118278-fb47-461d-a658-70977cf10ff8:/dev/nvme0n1p3] Apr 17 18:10:03.590559 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.590488 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 18:10:03.591814 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.591795 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:10:03.595846 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.595700 2576 manager.go:217] Machine: {Timestamp:2026-04-17 18:10:03.594537845 +0000 UTC m=+0.367012888 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:2499996 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d2a3f47d16da1a60e222f9e43b08f SystemUUID:ec2d2a3f-47d1-6da1-a60e-222f9e43b08f BootID:96928a47-d8af-4b4e-a6ba-238fe3a45efc Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs 
Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2b:91:02:79:a1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2b:91:02:79:a1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:56:b1:bc:d1:17:78 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 18:10:03.595846 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.595833 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 18:10:03.595983 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.595935 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 18:10:03.597165 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597129 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 18:10:03.597340 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597167 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-134-83.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 18:10:03.597387 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597352 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 18:10:03.597387 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597360 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 18:10:03.597387 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597374 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 18:10:03.597470 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.597394 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 18:10:03.598868 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.598854 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 18:10:03.599022 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.599012 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 18:10:03.601221 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.601205 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 18:10:03.601277 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.601232 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 18:10:03.601898 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.601886 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 18:10:03.601931 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.601906 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 17 18:10:03.601931 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.601922 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 18:10:03.603077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.603061 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 18:10:03.603135 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.603082 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 18:10:03.606174 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.606154 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 18:10:03.607717 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.607701 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 18:10:03.609489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609429 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 18:10:03.609489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609448 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 18:10:03.609489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609455 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 18:10:03.609489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609460 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 18:10:03.609489 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609466 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609501 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609511 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609519 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609530 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609540 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609565 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 18:10:03.609696 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.609579 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 17 18:10:03.610510 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.610498 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 17 18:10:03.610550 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.610513 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 17 18:10:03.614578 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.614558 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 17 18:10:03.614695 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.614606 2576 server.go:1295] "Started kubelet"
Apr 17 18:10:03.614750 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.614691 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 17 18:10:03.615261 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.615198 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 17 18:10:03.615359 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.615293 2576 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 17 18:10:03.615715 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.615697 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-134-83.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 18:10:03.615715 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.615708 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-134-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 18:10:03.615709 ip-10-0-134-83 systemd[1]: Started Kubernetes Kubelet.
Apr 17 18:10:03.615954 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.615874 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 18:10:03.617526 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.617509 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 18:10:03.619354 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.619325 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 18:10:03.623568 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.623542 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 18:10:03.627057 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627028 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 18:10:03.627227 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627036 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 18:10:03.627632 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627610 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 18:10:03.627632 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627633 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 18:10:03.627820 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627808 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 18:10:03.627880 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.627847 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:03.627933 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627888 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 18:10:03.627933 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.624684 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-134-83.ec2.internal.18a737520fe6b2f5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-134-83.ec2.internal,UID:ip-10-0-134-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-134-83.ec2.internal,},FirstTimestamp:2026-04-17 18:10:03.614573301 +0000 UTC m=+0.387048344,LastTimestamp:2026-04-17 18:10:03.614573301 +0000 UTC m=+0.387048344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-134-83.ec2.internal,}"
Apr 17 18:10:03.627933 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.627897 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 18:10:03.628618 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.628597 2576 factory.go:55] Registering systemd factory
Apr 17 18:10:03.628726 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.628666 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 17 18:10:03.628941 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.628928 2576 factory.go:153] Registering CRI-O factory
Apr 17 18:10:03.629001 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.628944 2576 factory.go:223] Registration of the crio container factory successfully
Apr 17 18:10:03.629034 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.629011 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 18:10:03.629068 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.629033 2576 factory.go:103] Registering Raw factory
Apr 17 18:10:03.629068 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.629049 2576 manager.go:1196] Started watching for new ooms in manager
Apr 17 18:10:03.629554 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.629542 2576 manager.go:319] Starting recovery of all containers
Apr 17 18:10:03.629957 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.629928 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 18:10:03.630079 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.629976 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-134-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 18:10:03.632519 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.632302 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9xgd"
Apr 17 18:10:03.637115 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.637067 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 18:10:03.639514 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.639487 2576 manager.go:324] Recovery completed
Apr 17 18:10:03.639710 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.639686 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-r9xgd"
Apr 17 18:10:03.645910 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.645887 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.649322 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649295 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.649448 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649345 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.649448 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649357 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.649922 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649903 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 18:10:03.649922 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649917 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 18:10:03.650047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.649939 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 18:10:03.653516 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.653498 2576 policy_none.go:49] "None policy: Start"
Apr 17 18:10:03.653591 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.653534 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 18:10:03.653591 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.653548 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.691017 2576 manager.go:341] "Starting Device Plugin manager"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.691051 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.691061 2576 server.go:85] "Starting device plugin registration server"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.691431 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.691444 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.691631 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.692209 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.692241 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.692282 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 18:10:03.713623 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.692321 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:03.786776 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.786687 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 18:10:03.786776 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.786727 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 18:10:03.786776 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.786754 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 18:10:03.786776 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.786760 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 18:10:03.787043 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.786801 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 18:10:03.789470 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.789445 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:03.792469 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.792447 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.794983 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.794958 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.795112 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.794998 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.795112 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.795014 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.795112 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.795042 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.804393 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.804367 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.804557 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.804400 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-134-83.ec2.internal\": node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:03.819452 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.819417 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:03.887137 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.887077 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"]
Apr 17 18:10:03.887254 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.887207 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.889192 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.889159 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.889286 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.889208 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.889286 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.889219 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.891647 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.891632 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.891812 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.891794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.891859 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.891840 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.892523 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892506 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.892614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892510 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.892614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892557 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.892614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892569 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.892614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892535 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.892614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.892607 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.897479 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.897457 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.897552 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.897506 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 18:10:03.898439 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.898421 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 18:10:03.898543 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.898453 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 18:10:03.898543 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.898465 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeHasSufficientPID"
Apr 17 18:10:03.908944 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.908914 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-83.ec2.internal\" not found" node="ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.915200 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.915158 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-134-83.ec2.internal\" not found" node="ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.920337 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:03.920312 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:03.930756 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.930718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b6dc9341d6e3a6c508cdf128e57395dc-config\") pod \"kube-apiserver-proxy-ip-10-0-134-83.ec2.internal\" (UID: \"b6dc9341d6e3a6c508cdf128e57395dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.930884 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.930765 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:03.930884 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:03.930821 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.021483 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.021442 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.031917 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.031892 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.032050 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.031926 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b6dc9341d6e3a6c508cdf128e57395dc-config\") pod \"kube-apiserver-proxy-ip-10-0-134-83.ec2.internal\" (UID: \"b6dc9341d6e3a6c508cdf128e57395dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.032050 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.031944 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.032050 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.032001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/b6dc9341d6e3a6c508cdf128e57395dc-config\") pod \"kube-apiserver-proxy-ip-10-0-134-83.ec2.internal\" (UID: \"b6dc9341d6e3a6c508cdf128e57395dc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.032050 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.032010 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.032214 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.032079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/85c4607f0afe2df9f8ce864e9d7b1bd7-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal\" (UID: \"85c4607f0afe2df9f8ce864e9d7b1bd7\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.122410 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.122312 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.210989 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.210954 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.218588 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.218560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"
Apr 17 18:10:04.223106 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.223083 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.323664 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.323609 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.424278 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.424164 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.524824 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.524766 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-134-83.ec2.internal\" not found"
Apr 17 18:10:04.527252 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.527232 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:04.536331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.536298 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 18:10:04.536463 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.536446 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 18:10:04.536509 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.536487 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 18:10:04.553537 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.553505 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:04.602470 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.602429 2576 apiserver.go:52] "Watching apiserver"
Apr 17 18:10:04.610856 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.610833 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 18:10:04.611254 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.611232 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fdxn5","openshift-image-registry/node-ca-cm9j8","openshift-multus/multus-67qdn","openshift-multus/multus-additional-cni-plugins-zhcv2","openshift-ovn-kubernetes/ovnkube-node-5w6tv","kube-system/konnectivity-agent-rpssj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl","openshift-cluster-node-tuning-operator/tuned-v86sj","openshift-multus/network-metrics-daemon-nmf4d","openshift-network-diagnostics/network-check-target-zm674","openshift-network-operator/iptables-alerter-7bztg"]
Apr 17 18:10:04.616086 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.616056 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fdxn5"
Apr 17 18:10:04.618196 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.618149 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.619051 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.619029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 18:10:04.619051 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.619034 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 18:10:04.619264 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.619064 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sk7kv\""
Apr 17 18:10:04.620346 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.620320 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.621966 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.621946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 18:10:04.622070 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.621997 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-qd9lw\""
Apr 17 18:10:04.622341 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.622319 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 18:10:04.622474 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.622408 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 18:10:04.622543 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.622528 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.622653 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.622635 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/konnectivity-agent-rpssj" Apr 17 18:10:04.623355 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.623337 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d2h25\"" Apr 17 18:10:04.623491 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.623475 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 18:10:04.623654 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.623639 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 18:10:04.623711 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.623669 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 18:10:04.623786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.623768 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 18:10:04.625121 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.625102 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 18:10:04.625225 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.625140 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:04.625285 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.625221 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:04.625438 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.625415 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 18:10:04.626095 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.626072 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 18:10:04.626652 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.626630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 18:10:04.628152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.627171 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zm7fs\"" Apr 17 18:10:04.628152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.627299 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 18:10:04.628152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.627427 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-k5lpw\"" Apr 17 18:10:04.628152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.628121 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal" Apr 17 18:10:04.628638 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.628614 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:04.628762 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.628737 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:04.631643 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.631623 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.633960 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.633931 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-lnzgm\"" Apr 17 18:10:04.634092 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634064 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 18:10:04.634171 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634154 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.634260 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634168 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 18:10:04.634311 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634154 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 18:10:04.634669 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634656 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 18:10:04.634864 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.634845 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 18:10:04.635226 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635208 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 18:10:04.635337 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.635402 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65xrg\" (UniqueName: \"kubernetes.io/projected/8ff23a89-3da2-420c-a6f3-bf94173e14c7-kube-api-access-65xrg\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: 
\"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.635402 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635386 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635498 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635405 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-socket-dir-parent\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635498 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c650676f-6713-458e-b337-21b94770e9f5-tmp-dir\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.635498 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-os-release\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635533 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635571 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-system-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635594 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-netns\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635609 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-os-release\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-etc-kubernetes\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635669 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-dxzq4\" (UniqueName: \"kubernetes.io/projected/3728df74-67ba-42d4-88b0-a83eca2d9e0f-kube-api-access-dxzq4\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635686 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0473a77b-a212-4e59-806b-bcef01945958-konnectivity-ca\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj" Apr 17 18:10:04.635706 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c650676f-6713-458e-b337-21b94770e9f5-hosts-file\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2rvg\" (UniqueName: \"kubernetes.io/projected/1d2787f0-024b-4888-980e-e458a856a250-kube-api-access-p2rvg\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635769 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-kubelet\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: 
I0417 18:10:04.635799 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-hostroot\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635825 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-daemon-config\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635848 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-host\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cnibin\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 
ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-k8s-cni-cncf-io\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635951 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cni-binary-copy\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.635970 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mst82\" (UniqueName: \"kubernetes.io/projected/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-kube-api-access-mst82\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636015 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cnibin\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-bin\") pod \"multus-67qdn\" (UID: 
\"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636051 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-multus\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636077 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636077 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-multus-certs\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636114 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-serviceca\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-system-cni-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636200 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snh2h\" (UniqueName: \"kubernetes.io/projected/c650676f-6713-458e-b337-21b94770e9f5-kube-api-access-snh2h\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636233 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636280 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-conf-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636334 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0473a77b-a212-4e59-806b-bcef01945958-agent-certs\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636500 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.636729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636553 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 18:10:04.637276 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.636987 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 18:10:04.637276 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.637007 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-kjkrk\"" Apr 17 18:10:04.638061 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.638041 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:10:04.638131 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.638113 2576 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal" Apr 17 18:10:04.638752 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.638719 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal"] Apr 17 18:10:04.638856 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.638807 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-7jk2w\"" Apr 17 18:10:04.638856 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.638833 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.640347 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.639029 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 18:10:04.640347 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.639054 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:10:04.640729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.640695 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 18:10:04.640818 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.640761 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 18:10:04.641175 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.641155 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 18:10:04.641305 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.641261 2576 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 18:10:04.641305 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.641286 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-j7rnz\"" Apr 17 18:10:04.642268 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.642236 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 18:05:03 +0000 UTC" deadline="2028-01-27 17:47:09.751652106 +0000 UTC" Apr 17 18:10:04.642268 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.642266 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15599h37m5.10938811s" Apr 17 18:10:04.648717 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.648696 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 18:10:04.649109 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.649088 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal"] Apr 17 18:10:04.728614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.728585 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 18:10:04.737367 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737340 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-hostroot\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.737367 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737372 
2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-daemon-config\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-var-lib-kubelet\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737414 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cnibin\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-k8s-cni-cncf-io\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-hostroot\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737462 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-device-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737489 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-k8s-cni-cncf-io\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737513 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cnibin\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737511 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prwv5\" (UniqueName: \"kubernetes.io/projected/1416e0f9-d096-4822-907d-583c7506c232-kube-api-access-prwv5\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.737579 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737570 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-var-lib-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737612 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-bin\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737638 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-kubernetes\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737663 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mst82\" (UniqueName: \"kubernetes.io/projected/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-kube-api-access-mst82\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cnibin\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-bin\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737718 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-multus-certs\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737743 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-sys-fs\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737752 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cnibin\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737760 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-bin\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737772 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqpqp\" (UniqueName: \"kubernetes.io/projected/7949a229-9d6d-445d-938c-4147dc073aaf-kube-api-access-nqpqp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-multus-certs\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-serviceca\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-system-cni-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-etc-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738033 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737967 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-system-cni-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-run\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.737994 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-daemon-config\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738047 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-conf-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738095 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0473a77b-a212-4e59-806b-bcef01945958-agent-certs\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738114 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-conf-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738122 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovn-node-metrics-cert\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-conf\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738197 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-netns\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c650676f-6713-458e-b337-21b94770e9f5-tmp-dir\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738352 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-serviceca\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-netns\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-socket-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738517 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 18:10:04.738886 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738542 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4wh4\" (UniqueName: \"kubernetes.io/projected/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-kube-api-access-x4wh4\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738594 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dxzq4\" (UniqueName: \"kubernetes.io/projected/3728df74-67ba-42d4-88b0-a83eca2d9e0f-kube-api-access-dxzq4\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738605 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c650676f-6713-458e-b337-21b94770e9f5-tmp-dir\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738632 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738662 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-run-netns\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0473a77b-a212-4e59-806b-bcef01945958-konnectivity-ca\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-kubelet\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738843 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-systemd-units\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738866 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-kubelet\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-host\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738950 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738953 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-host\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.738974 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-log-socket\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-modprobe-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739093 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cni-binary-copy\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-multus\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.739761 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-kubelet\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739156 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-netd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739160 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0473a77b-a212-4e59-806b-bcef01945958-konnectivity-ca\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739204 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-lib-modules\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739222 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-host-var-lib-cni-multus\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739229 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpt65\" (UniqueName: \"kubernetes.io/projected/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-kube-api-access-fpt65\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739263 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-slash\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739290 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739317 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-iptables-alerter-script\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739346 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snh2h\" (UniqueName: \"kubernetes.io/projected/c650676f-6713-458e-b337-21b94770e9f5-kube-api-access-snh2h\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739377 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-registration-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739432 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-ovn\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.739476 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739444 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.739568 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:05.239533401 +0000 UTC m=+2.012008456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:04.740542 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3728df74-67ba-42d4-88b0-a83eca2d9e0f-cni-binary-copy\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739539 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739634 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-systemd\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65xrg\" (UniqueName: \"kubernetes.io/projected/8ff23a89-3da2-420c-a6f3-bf94173e14c7-kube-api-access-65xrg\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-systemd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739705 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-node-log\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739737 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-script-lib\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysconfig\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739789 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739814 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-socket-dir-parent\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739841 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-os-release\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739864 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-system-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-config\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739915 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-env-overrides\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.739949 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-os-release\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-os-release\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8ff23a89-3da2-420c-a6f3-bf94173e14c7-os-release\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-system-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-cni-dir\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-etc-kubernetes\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740137 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-multus-socket-dir-parent\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740143 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName:
\"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-etc-selinux\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740168 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-sys\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3728df74-67ba-42d4-88b0-a83eca2d9e0f-etc-kubernetes\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-host\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-etc-tuned\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740294 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-tmp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-host-slash\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c650676f-6713-458e-b337-21b94770e9f5-hosts-file\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740374 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p2rvg\" (UniqueName: \"kubernetes.io/projected/1d2787f0-024b-4888-980e-e458a856a250-kube-api-access-p2rvg\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740485 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c650676f-6713-458e-b337-21b94770e9f5-hosts-file\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.741786 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.740790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8ff23a89-3da2-420c-a6f3-bf94173e14c7-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.742271 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.741832 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0473a77b-a212-4e59-806b-bcef01945958-agent-certs\") pod \"konnectivity-agent-rpssj\" (UID: \"0473a77b-a212-4e59-806b-bcef01945958\") " pod="kube-system/konnectivity-agent-rpssj" Apr 17 18:10:04.743576 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.743549 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:10:04.743576 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.743577 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:10:04.743758 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.743594 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:04.743758 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:04.743662 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:05.243643719 +0000 UTC m=+2.016118755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:04.746116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.746091 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxzq4\" (UniqueName: \"kubernetes.io/projected/3728df74-67ba-42d4-88b0-a83eca2d9e0f-kube-api-access-dxzq4\") pod \"multus-67qdn\" (UID: \"3728df74-67ba-42d4-88b0-a83eca2d9e0f\") " pod="openshift-multus/multus-67qdn" Apr 17 18:10:04.746302 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.746284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mst82\" (UniqueName: \"kubernetes.io/projected/d2f215a7-d8e6-4b38-bd88-ad6bf1f07470-kube-api-access-mst82\") pod \"node-ca-cm9j8\" (UID: \"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470\") " pod="openshift-image-registry/node-ca-cm9j8" Apr 17 18:10:04.746342 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.746286 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkxqh" Apr 17 18:10:04.751111 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.751081 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snh2h\" (UniqueName: \"kubernetes.io/projected/c650676f-6713-458e-b337-21b94770e9f5-kube-api-access-snh2h\") pod \"node-resolver-fdxn5\" (UID: \"c650676f-6713-458e-b337-21b94770e9f5\") " pod="openshift-dns/node-resolver-fdxn5" Apr 17 18:10:04.751278 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.751258 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65xrg\" (UniqueName: 
\"kubernetes.io/projected/8ff23a89-3da2-420c-a6f3-bf94173e14c7-kube-api-access-65xrg\") pod \"multus-additional-cni-plugins-zhcv2\" (UID: \"8ff23a89-3da2-420c-a6f3-bf94173e14c7\") " pod="openshift-multus/multus-additional-cni-plugins-zhcv2" Apr 17 18:10:04.752229 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.752172 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85c4607f0afe2df9f8ce864e9d7b1bd7.slice/crio-560efd8aa27ca6828ddac792cb14e6a6607168ac64c61e9f888b3274c8af0641 WatchSource:0}: Error finding container 560efd8aa27ca6828ddac792cb14e6a6607168ac64c61e9f888b3274c8af0641: Status 404 returned error can't find the container with id 560efd8aa27ca6828ddac792cb14e6a6607168ac64c61e9f888b3274c8af0641 Apr 17 18:10:04.753092 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.753070 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6dc9341d6e3a6c508cdf128e57395dc.slice/crio-0586dcf3fbbbafebf132d381303b482daf9ca34acef9ef2e6240696c457dc722 WatchSource:0}: Error finding container 0586dcf3fbbbafebf132d381303b482daf9ca34acef9ef2e6240696c457dc722: Status 404 returned error can't find the container with id 0586dcf3fbbbafebf132d381303b482daf9ca34acef9ef2e6240696c457dc722 Apr 17 18:10:04.755222 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.755198 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-rkxqh" Apr 17 18:10:04.756559 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.756538 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2rvg\" (UniqueName: \"kubernetes.io/projected/1d2787f0-024b-4888-980e-e458a856a250-kube-api-access-p2rvg\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:04.758256 ip-10-0-134-83 
kubenswrapper[2576]: I0417 18:10:04.758242 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 18:10:04.789490 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.789426 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal" event={"ID":"b6dc9341d6e3a6c508cdf128e57395dc","Type":"ContainerStarted","Data":"0586dcf3fbbbafebf132d381303b482daf9ca34acef9ef2e6240696c457dc722"} Apr 17 18:10:04.790272 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.790251 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal" event={"ID":"85c4607f0afe2df9f8ce864e9d7b1bd7","Type":"ContainerStarted","Data":"560efd8aa27ca6828ddac792cb14e6a6607168ac64c61e9f888b3274c8af0641"} Apr 17 18:10:04.841649 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-lib-modules\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.841649 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841653 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpt65\" (UniqueName: \"kubernetes.io/projected/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-kube-api-access-fpt65\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841680 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-slash\") pod \"ovnkube-node-5w6tv\" (UID: 
\"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841703 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-iptables-alerter-script\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-slash\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-registration-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-ovn\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841802 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-lib-modules\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841815 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841828 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.841875 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-registration-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841929 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-ovn\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.841958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842005 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-systemd\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-systemd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842055 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-node-log\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-systemd\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842080 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842089 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-systemd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842092 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-node-log\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842107 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-script-lib\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysconfig\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842205 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysconfig\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-config\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842261 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-env-overrides\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842282 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-etc-selinux\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.842316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-sys\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-host\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-etc-tuned\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842375 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-etc-selinux\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842388 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-sys\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842393 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-tmp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-host\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842418 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-host-slash\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842457 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-host-slash\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg" Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-var-lib-kubelet\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-device-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842521 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-prwv5\" (UniqueName: \"kubernetes.io/projected/1416e0f9-d096-4822-907d-583c7506c232-kube-api-access-prwv5\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-var-lib-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842548 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-iptables-alerter-script\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842564 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842591 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-run-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-bin\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843116 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842596 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-device-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842628 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-kubernetes\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842669 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-kubernetes\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-sys-fs\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-bin\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842710 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-script-lib\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842739 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqpqp\" (UniqueName: \"kubernetes.io/projected/7949a229-9d6d-445d-938c-4147dc073aaf-kube-api-access-nqpqp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842754 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-sys-fs\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842758 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-var-lib-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842780 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-etc-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842840 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-run\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842858 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-etc-openvswitch\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842870 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovn-node-metrics-cert\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842869 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovnkube-config\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842863 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-env-overrides\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842912 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-conf\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842917 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-var-lib-kubelet\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.843712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842922 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-run\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842947 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-netns\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.842969 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-socket-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x4wh4\" (UniqueName: \"kubernetes.io/projected/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-kube-api-access-x4wh4\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-run-netns\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843035 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-sysctl-conf\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843094 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-systemd-units\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843105 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-log-socket\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1416e0f9-d096-4822-907d-583c7506c232-socket-dir\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843170 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-modprobe-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843219 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-kubelet\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843223 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-systemd-units\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843228 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-log-socket\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843245 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-netd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-kubelet\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844246 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7949a229-9d6d-445d-938c-4147dc073aaf-etc-modprobe-d\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.844826 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.843324 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-host-cni-netd\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.844985 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.844965 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-tmp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.845048 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.845006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7949a229-9d6d-445d-938c-4147dc073aaf-etc-tuned\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.845141 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.845123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-ovn-node-metrics-cert\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.850952 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.850926 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-prwv5\" (UniqueName: \"kubernetes.io/projected/1416e0f9-d096-4822-907d-583c7506c232-kube-api-access-prwv5\") pod \"aws-ebs-csi-driver-node-kp9nl\" (UID: \"1416e0f9-d096-4822-907d-583c7506c232\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:04.851137 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.851082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpt65\" (UniqueName: \"kubernetes.io/projected/a87beab3-c9c8-4166-b1ea-18aacdaa1b02-kube-api-access-fpt65\") pod \"iptables-alerter-7bztg\" (UID: \"a87beab3-c9c8-4166-b1ea-18aacdaa1b02\") " pod="openshift-network-operator/iptables-alerter-7bztg"
Apr 17 18:10:04.851394 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.851374 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqpqp\" (UniqueName: \"kubernetes.io/projected/7949a229-9d6d-445d-938c-4147dc073aaf-kube-api-access-nqpqp\") pod \"tuned-v86sj\" (UID: \"7949a229-9d6d-445d-938c-4147dc073aaf\") " pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:04.851617 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.851601 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4wh4\" (UniqueName: \"kubernetes.io/projected/0c89c182-082a-4aa5-95e4-40fa3cd0c63d-kube-api-access-x4wh4\") pod \"ovnkube-node-5w6tv\" (UID: \"0c89c182-082a-4aa5-95e4-40fa3cd0c63d\") " pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:04.946809 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.946718 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fdxn5"
Apr 17 18:10:04.953529 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.953500 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc650676f_6713_458e_b337_21b94770e9f5.slice/crio-136860f9bb25b97ccb591d34fe28970db11c7d43a8b75f5d9e45101fca9ccf8c WatchSource:0}: Error finding container 136860f9bb25b97ccb591d34fe28970db11c7d43a8b75f5d9e45101fca9ccf8c: Status 404 returned error can't find the container with id 136860f9bb25b97ccb591d34fe28970db11c7d43a8b75f5d9e45101fca9ccf8c
Apr 17 18:10:04.961915 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.961892 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-cm9j8"
Apr 17 18:10:04.969335 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.969301 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2f215a7_d8e6_4b38_bd88_ad6bf1f07470.slice/crio-b98a18bb0dc47b84c1492b2ac17a68adb27f2f41e39d059c6273ab986fbe141f WatchSource:0}: Error finding container b98a18bb0dc47b84c1492b2ac17a68adb27f2f41e39d059c6273ab986fbe141f: Status 404 returned error can't find the container with id b98a18bb0dc47b84c1492b2ac17a68adb27f2f41e39d059c6273ab986fbe141f
Apr 17 18:10:04.976594 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.976569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-67qdn"
Apr 17 18:10:04.982303 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.982279 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhcv2"
Apr 17 18:10:04.984006 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.983968 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3728df74_67ba_42d4_88b0_a83eca2d9e0f.slice/crio-5683aad61d2ca07bfa55d5628f7dd3e52109ad746eef6f927ea656b22967a508 WatchSource:0}: Error finding container 5683aad61d2ca07bfa55d5628f7dd3e52109ad746eef6f927ea656b22967a508: Status 404 returned error can't find the container with id 5683aad61d2ca07bfa55d5628f7dd3e52109ad746eef6f927ea656b22967a508
Apr 17 18:10:04.990120 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:04.990091 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff23a89_3da2_420c_a6f3_bf94173e14c7.slice/crio-b832fd00a1d9f3b33d5fe1a45707883695e945dbb960310d668ec30c297ed34c WatchSource:0}: Error finding container b832fd00a1d9f3b33d5fe1a45707883695e945dbb960310d668ec30c297ed34c: Status 404 returned error can't find the container with id b832fd00a1d9f3b33d5fe1a45707883695e945dbb960310d668ec30c297ed34c
Apr 17 18:10:04.999894 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:04.999863 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:05.006679 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:05.006649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0473a77b_a212_4e59_806b_bcef01945958.slice/crio-8bc5ab618ee5a24c9a1554caa53e76311821dbfd07586ab1a5364ad71ac0572d WatchSource:0}: Error finding container 8bc5ab618ee5a24c9a1554caa53e76311821dbfd07586ab1a5364ad71ac0572d: Status 404 returned error can't find the container with id 8bc5ab618ee5a24c9a1554caa53e76311821dbfd07586ab1a5364ad71ac0572d
Apr 17 18:10:05.014303 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.014281 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:05.030249 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.030226 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl"
Apr 17 18:10:05.037286 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:05.037255 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1416e0f9_d096_4822_907d_583c7506c232.slice/crio-5a89c1ed237fe2f34d27c4e69a2b9426bedb7c1cbabf2af7697ead665b18a866 WatchSource:0}: Error finding container 5a89c1ed237fe2f34d27c4e69a2b9426bedb7c1cbabf2af7697ead665b18a866: Status 404 returned error can't find the container with id 5a89c1ed237fe2f34d27c4e69a2b9426bedb7c1cbabf2af7697ead665b18a866
Apr 17 18:10:05.065306 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.065265 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v86sj"
Apr 17 18:10:05.070271 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.070061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7bztg"
Apr 17 18:10:05.071710 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:05.071683 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7949a229_9d6d_445d_938c_4147dc073aaf.slice/crio-5d44d53bd746fa82144fb2dd3c5517995295b04f30ec30f0e0515b29a53a56ec WatchSource:0}: Error finding container 5d44d53bd746fa82144fb2dd3c5517995295b04f30ec30f0e0515b29a53a56ec: Status 404 returned error can't find the container with id 5d44d53bd746fa82144fb2dd3c5517995295b04f30ec30f0e0515b29a53a56ec
Apr 17 18:10:05.076740 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:05.076705 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda87beab3_c9c8_4166_b1ea_18aacdaa1b02.slice/crio-53319498fdf956eef4de65600720364808a31c810077ee5f495b87b0c306c3e4 WatchSource:0}: Error finding container 53319498fdf956eef4de65600720364808a31c810077ee5f495b87b0c306c3e4: Status 404 returned error can't find the container with id 53319498fdf956eef4de65600720364808a31c810077ee5f495b87b0c306c3e4
Apr 17 18:10:05.131481 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.131444 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:05.245693 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.245600 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:05.245693 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.245664 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245754 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245764 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245780 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245793 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245823 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:06.245804577 +0000 UTC m=+3.018279609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:05.245924 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.245841 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:06.245832375 +0000 UTC m=+3.018307404 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:05.410614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.410570 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:05.756278 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.756174 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:05:04 +0000 UTC" deadline="2027-12-23 23:22:29.847301104 +0000 UTC"
Apr 17 18:10:05.756278 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.756228 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14765h12m24.091077341s"
Apr 17 18:10:05.787228 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.787176 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:05.787410 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:05.787361 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:05.812863 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.812726 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerStarted","Data":"b832fd00a1d9f3b33d5fe1a45707883695e945dbb960310d668ec30c297ed34c"}
Apr 17 18:10:05.823965 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.823863 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm9j8" event={"ID":"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470","Type":"ContainerStarted","Data":"b98a18bb0dc47b84c1492b2ac17a68adb27f2f41e39d059c6273ab986fbe141f"}
Apr 17 18:10:05.829447 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.829391 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fdxn5" event={"ID":"c650676f-6713-458e-b337-21b94770e9f5","Type":"ContainerStarted","Data":"136860f9bb25b97ccb591d34fe28970db11c7d43a8b75f5d9e45101fca9ccf8c"}
Apr 17 18:10:05.846461 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.846419 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7bztg" event={"ID":"a87beab3-c9c8-4166-b1ea-18aacdaa1b02","Type":"ContainerStarted","Data":"53319498fdf956eef4de65600720364808a31c810077ee5f495b87b0c306c3e4"}
Apr 17 18:10:05.856621 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.856580 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"43bb2e242275fb5b48160879e5d59ee665bdb06b3a9dfe8668a8bd9849bd27dd"}
Apr 17 18:10:05.870346 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.870289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rpssj" event={"ID":"0473a77b-a212-4e59-806b-bcef01945958","Type":"ContainerStarted","Data":"8bc5ab618ee5a24c9a1554caa53e76311821dbfd07586ab1a5364ad71ac0572d"}
Apr 17 18:10:05.875957 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.875914 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67qdn" event={"ID":"3728df74-67ba-42d4-88b0-a83eca2d9e0f","Type":"ContainerStarted","Data":"5683aad61d2ca07bfa55d5628f7dd3e52109ad746eef6f927ea656b22967a508"}
Apr 17 18:10:05.896817 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.896749 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v86sj" event={"ID":"7949a229-9d6d-445d-938c-4147dc073aaf","Type":"ContainerStarted","Data":"5d44d53bd746fa82144fb2dd3c5517995295b04f30ec30f0e0515b29a53a56ec"}
Apr 17 18:10:05.925252 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:05.925121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" event={"ID":"1416e0f9-d096-4822-907d-583c7506c232","Type":"ContainerStarted","Data":"5a89c1ed237fe2f34d27c4e69a2b9426bedb7c1cbabf2af7697ead665b18a866"}
Apr 17 18:10:06.253806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.253712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:06.253806 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.253786 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:06.254026 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.253906 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:06.254026 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.253965 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:08.253947899 +0000 UTC m=+5.026422936 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:06.254408 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.254385 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:06.254408 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.254410 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:06.254545 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.254422 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:06.254545 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.254469 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:08.254453241 +0000 UTC m=+5.026928271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:06.285522 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.285455 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 18:10:06.757088 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.757038 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 18:05:04 +0000 UTC" deadline="2028-01-01 06:19:22.157860314 +0000 UTC"
Apr 17 18:10:06.757088 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.757083 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14964h9m15.400781097s"
Apr 17 18:10:06.787457 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:06.787416 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:06.787640 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:06.787561 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:07.787349 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.787316 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:07.787855 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:07.787475 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:07.919253 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.918512 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-nns8d"]
Apr 17 18:10:07.921941 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.921492 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:07.921941 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:07.921578 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:07.969105 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.969066 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-kubelet-config\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:07.969351 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.969149 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:07.969351 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:07.969248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-dbus\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.070142 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.070054 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-dbus\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.070142 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.070117 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-kubelet-config\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.070388 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.070163 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.070388 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.070345 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:08.070493 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.070413 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:08.570393832 +0000 UTC m=+5.342868864 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:08.070818 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.070786 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-dbus\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.070914 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.070868 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-kubelet-config\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.271210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.271288 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:08.271919 ip-10-0-134-83 
kubenswrapper[2576]: E0417 18:10:08.271410 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.271424 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.271434 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.271478 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:12.271465065 +0000 UTC m=+9.043940095 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.271825 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:08.271919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.271882 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:12.271872097 +0000 UTC m=+9.044347128 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:08.574039 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.573896 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:08.574207 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.574080 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:08.574207 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.574142 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:09.574127505 +0000 UTC m=+6.346602534 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:08.787715 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:08.787665 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:08.788222 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:08.787813 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:09.583326 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:09.583280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:09.583514 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:09.583404 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:09.583514 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:09.583458 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:11.583444324 +0000 UTC m=+8.355919358 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:09.790437 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:09.790351 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:09.790782 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:09.790494 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:09.790939 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:09.790921 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:09.791045 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:09.791021 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:10.787029 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:10.786989 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:10.787263 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:10.787126 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:11.600030 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:11.599987 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:11.600519 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:11.600152 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:11.600519 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:11.600245 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:15.600225984 +0000 UTC m=+12.372701016 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:11.787050 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:11.787007 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:11.787243 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:11.787149 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:11.787710 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:11.787681 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:11.787835 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:11.787796 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:12.307011 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:12.306963 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:12.307276 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:12.307040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:12.307276 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307167 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:12.307276 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:20.307231659 +0000 UTC m=+17.079706696 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:12.307729 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307701 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:10:12.307729 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307727 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:10:12.307847 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307741 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:12.307847 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.307792 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:20.307775851 +0000 UTC m=+17.080250883 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:12.788152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:12.787987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:12.788601 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:12.788274 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:13.789635 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:13.788372 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:13.789635 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:13.788520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:13.789635 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:13.788575 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:13.789635 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:13.788718 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:14.788015 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:14.787977 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:14.788231 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:14.788108 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:15.631755 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:15.631652 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:15.632151 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:15.631826 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:15.632151 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:15.631904 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:23.631881582 +0000 UTC m=+20.404356612 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered Apr 17 18:10:15.787929 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:15.787885 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:15.788114 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:15.787892 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:15.788114 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:15.788010 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:15.788224 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:15.788128 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:16.787474 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:16.787432 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:16.787919 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:16.787573 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:17.787840 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:17.787799 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:17.788331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:17.787799 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:17.788331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:17.787958 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:17.788331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:17.788013 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:18.787107 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:18.787071 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:18.787324 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:18.787206 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:19.787672 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:19.787631 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:19.788146 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:19.787779 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:19.788146 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:19.787836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:19.788146 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:19.787955 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:20.369063 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:20.369014 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:20.369077 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369212 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369213 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369243 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369259 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369282 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:36.369262317 +0000 UTC m=+33.141737346 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 18:10:20.369333 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.369305 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:36.369293598 +0000 UTC m=+33.141768628 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 18:10:20.787673 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:20.787579 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:20.787817 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:20.787723 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:21.787485 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:21.787442 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:21.787782 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:21.787586 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:21.787782 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:21.787657 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:21.788171 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:21.787794 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:22.787660 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:22.787624 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:22.787848 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:22.787734 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:23.698498 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:23.698461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:23.698642 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:23.698618 2576 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:23.698703 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:23.698691 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret podName:8d7c622e-a8b2-4ec5-b59a-62c39c3285bd nodeName:}" failed. No retries permitted until 2026-04-17 18:10:39.698668319 +0000 UTC m=+36.471143370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret") pod "global-pull-secret-syncer-nns8d" (UID: "8d7c622e-a8b2-4ec5-b59a-62c39c3285bd") : object "kube-system"/"original-pull-secret" not registered
Apr 17 18:10:23.788255 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:23.788223 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:23.788255 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:23.788248 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:23.788599 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:23.788351 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:23.788599 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:23.788441 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:24.788223 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.787897 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:24.788396 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:24.788316 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:24.967134 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.966725 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal" event={"ID":"b6dc9341d6e3a6c508cdf128e57395dc","Type":"ContainerStarted","Data":"b2e9f3d2aa03de8b2a5039735d857685b9882bf0b905d9687a057e688e409e25"}
Apr 17 18:10:24.971628 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971598 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"20b029ee7b8440f590066fc11b6db0f14469b60f55c4b812ba8b776778917499"}
Apr 17 18:10:24.971739 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971640 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"49675a0ad2a9d53ba67aeae899c8dd4c06c9e30bd419672a7d4f0edb649e73d9"}
Apr 17 18:10:24.971739 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971655 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"97a33ceda316b8779eadb6ff3dbc97f11a4f14cf7898146be57175559d20e445"}
Apr 17 18:10:24.971739 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971667 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"d808199fedf75705b0630d8865bcb38c6bff886e8fd6c80ecf1598ce4d3a0082"}
Apr 17 18:10:24.971739 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"97cd79acf5c15535ba7bce1cde1b3a5ffa9524b0532e83e792a412259a906c0b"}
Apr 17 18:10:24.971739 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.971690 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"7031dd0630a680ee2e7aabf2ff601bd89138c6cf8a6f15921e589cb04b99a046"}
Apr 17 18:10:24.977409 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.976268 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-67qdn" event={"ID":"3728df74-67ba-42d4-88b0-a83eca2d9e0f","Type":"ContainerStarted","Data":"91e50f988affdbdc0674f30a3bd1a34961941446665a4b8872e2e55aaf8d06c3"}
Apr 17 18:10:24.977950 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.977922 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v86sj" event={"ID":"7949a229-9d6d-445d-938c-4147dc073aaf","Type":"ContainerStarted","Data":"2c6256730bad8ee6425570f55599bdbb7c25fc9b1d0d5dfb04808ee8d7797251"}
Apr 17 18:10:24.979425 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.979379 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-134-83.ec2.internal" podStartSLOduration=20.979365403 podStartE2EDuration="20.979365403s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:10:24.978605137 +0000 UTC m=+21.751080190" watchObservedRunningTime="2026-04-17 18:10:24.979365403 +0000 UTC m=+21.751840455"
Apr 17 18:10:24.993576 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:24.993498 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-67qdn" podStartSLOduration=3.161247032 podStartE2EDuration="21.9934804s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:10:04.986529836 +0000 UTC m=+1.759004869" lastFinishedPulling="2026-04-17 18:10:23.818763207 +0000 UTC m=+20.591238237" observedRunningTime="2026-04-17 18:10:24.992977593 +0000 UTC m=+21.765452648" watchObservedRunningTime="2026-04-17 18:10:24.9934804 +0000 UTC m=+21.765955453"
Apr 17 18:10:25.008076 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.008029 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v86sj" podStartSLOduration=2.270812094 podStartE2EDuration="21.008013748s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="2026-04-17 18:10:05.073462999 +0000 UTC m=+1.845938033" lastFinishedPulling="2026-04-17 18:10:23.810664646 +0000 UTC m=+20.583139687" observedRunningTime="2026-04-17 18:10:25.007651102 +0000 UTC m=+21.780126155" watchObservedRunningTime="2026-04-17 18:10:25.008013748 +0000 UTC m=+21.780488799"
Apr 17 18:10:25.787853 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.787816 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:25.787853 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.787836 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:25.788129 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:25.787961 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:25.788129 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:25.788042 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:25.981514 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.981483 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7bztg" event={"ID":"a87beab3-c9c8-4166-b1ea-18aacdaa1b02","Type":"ContainerStarted","Data":"d9a7ddffe6724b122ab6e4968d5241038a4db36e7ad574c4ca4c22883fe18c75"}
Apr 17 18:10:25.982796 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.982768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rpssj" event={"ID":"0473a77b-a212-4e59-806b-bcef01945958","Type":"ContainerStarted","Data":"d7bbf629d6262559048c49e8fae7fd57f831f220233387e14bb8c2a1c49f039d"}
Apr 17 18:10:25.984080 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.984055 2576 generic.go:358] "Generic (PLEG): container finished" podID="85c4607f0afe2df9f8ce864e9d7b1bd7" containerID="6115b9b95d0cf043ddab580ca679ae4ab1ff159ce22f7f1ded5c30b7905165ce" exitCode=0
Apr 17 18:10:25.984206 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.984120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal" event={"ID":"85c4607f0afe2df9f8ce864e9d7b1bd7","Type":"ContainerDied","Data":"6115b9b95d0cf043ddab580ca679ae4ab1ff159ce22f7f1ded5c30b7905165ce"}
Apr 17 18:10:25.985551 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.985524 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" event={"ID":"1416e0f9-d096-4822-907d-583c7506c232","Type":"ContainerStarted","Data":"6b8a14f74446a69e89189828f1c6b83673b93b02833d6c1b8220c7bfd042e706"}
Apr 17 18:10:25.987515 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.987302 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="85d56ca20efb35e3078a72fbc06e711db8989100cf159da14de3612b21cf240e" exitCode=0
Apr 17 18:10:25.987515 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.987372 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"85d56ca20efb35e3078a72fbc06e711db8989100cf159da14de3612b21cf240e"}
Apr 17 18:10:25.990539 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.990496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-cm9j8" event={"ID":"d2f215a7-d8e6-4b38-bd88-ad6bf1f07470","Type":"ContainerStarted","Data":"3efcd5857d0c94bfe6e3c398da5bf1bae9dbfffabe023336a0a1de866cff6c0b"}
Apr 17 18:10:25.992307 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:25.992211 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fdxn5" event={"ID":"c650676f-6713-458e-b337-21b94770e9f5","Type":"ContainerStarted","Data":"989d8df026fcd0c209f683424ea05195e665e49bfb9649b270d4e7502090c583"}
Apr 17 18:10:26.004555 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.004430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7bztg" podStartSLOduration=3.274971668 podStartE2EDuration="22.004410543s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="2026-04-17 18:10:05.078140089 +0000 UTC m=+1.850615120" lastFinishedPulling="2026-04-17 18:10:23.807578952 +0000 UTC m=+20.580053995" observedRunningTime="2026-04-17 18:10:26.003786647 +0000 UTC m=+22.776261699" watchObservedRunningTime="2026-04-17 18:10:26.004410543 +0000 UTC m=+22.776885596"
Apr 17 18:10:26.005728 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.005692 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 18:10:26.016825 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.016766 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rpssj" podStartSLOduration=4.267424427 podStartE2EDuration="23.016747383s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:10:05.008288018 +0000 UTC m=+1.780763049" lastFinishedPulling="2026-04-17 18:10:23.757610975 +0000 UTC m=+20.530086005" observedRunningTime="2026-04-17 18:10:26.016310076 +0000 UTC m=+22.788785129" watchObservedRunningTime="2026-04-17 18:10:26.016747383 +0000 UTC m=+22.789222434"
Apr 17 18:10:26.041961 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.041845 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fdxn5" podStartSLOduration=4.239433192 podStartE2EDuration="23.041825918s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:10:04.955220116 +0000 UTC m=+1.727695145" lastFinishedPulling="2026-04-17 18:10:23.757612842 +0000 UTC m=+20.530087871" observedRunningTime="2026-04-17 18:10:26.041780635 +0000 UTC m=+22.814255689" watchObservedRunningTime="2026-04-17 18:10:26.041825918 +0000 UTC m=+22.814300962"
Apr 17 18:10:26.054043 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.053982 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-cm9j8" podStartSLOduration=4.239320088 podStartE2EDuration="23.053961541s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:10:04.970919883 +0000 UTC m=+1.743394912" lastFinishedPulling="2026-04-17 18:10:23.785561336 +0000 UTC m=+20.558036365" observedRunningTime="2026-04-17 18:10:26.05354195 +0000 UTC m=+22.826017014" watchObservedRunningTime="2026-04-17 18:10:26.053961541 +0000 UTC m=+22.826436595"
Apr 17 18:10:26.705433 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.705310 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T18:10:26.005711907Z","UUID":"01e981f5-309d-4afb-909a-b70ee37d5c36","Handler":null,"Name":"","Endpoint":""}
Apr 17 18:10:26.707148 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.707124 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 18:10:26.707148 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.707154 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 18:10:26.787955 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.787911 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:26.788138 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:26.788033 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:26.997079 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:26.997039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal" event={"ID":"85c4607f0afe2df9f8ce864e9d7b1bd7","Type":"ContainerStarted","Data":"d6b14fd6aac4602a1e52db32d8b0d3e304f819a3bdc61b6036f43b1f4be22af1"}
Apr 17 18:10:27.001014 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.000975 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" event={"ID":"1416e0f9-d096-4822-907d-583c7506c232","Type":"ContainerStarted","Data":"cbc4711c340bcdd3d86541400f38a4013b6d8e3adfe824ff033cc6e06f73174e"}
Apr 17 18:10:27.001174 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.001029 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" event={"ID":"1416e0f9-d096-4822-907d-583c7506c232","Type":"ContainerStarted","Data":"5af12351d3d6f25c0587def6576ef981afb7ad8f8febe8191b7326c3c71a5260"}
Apr 17 18:10:27.005001 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.004963 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"d3c635e32bf588f6f290077d3caf6833bc07e36d35c9aeb26fdc77dd85b57fc3"}
Apr 17 18:10:27.012237 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.012155 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-134-83.ec2.internal" podStartSLOduration=23.012134807 podStartE2EDuration="23.012134807s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:10:27.011583293 +0000 UTC m=+23.784058344" watchObservedRunningTime="2026-04-17 18:10:27.012134807 +0000 UTC m=+23.784609863"
Apr 17 18:10:27.031243 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.031026 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kp9nl" podStartSLOduration=1.220255987 podStartE2EDuration="23.03100972s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="2026-04-17 18:10:05.0389328 +0000 UTC m=+1.811407830" lastFinishedPulling="2026-04-17 18:10:26.849686518 +0000 UTC m=+23.622161563" observedRunningTime="2026-04-17 18:10:27.030625618 +0000 UTC m=+23.803100671" watchObservedRunningTime="2026-04-17 18:10:27.03100972 +0000 UTC m=+23.803484772"
Apr 17 18:10:27.787402 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.787367 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:27.787551 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:27.787404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:27.787551 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:27.787467 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:27.787642 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:27.787595 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:28.787308 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:28.787212 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:28.787842 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:28.787361 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:29.079594 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:29.079487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:29.787607 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:29.787569 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:29.788120 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:29.787703 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:29.788120 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:29.787755 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:29.788120 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:29.787877 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:29.828683 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:29.828640 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:29.829440 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:29.829417 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:30.011676 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:30.011648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rpssj"
Apr 17 18:10:30.787848 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:30.787659 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674"
Apr 17 18:10:30.788371 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:30.787935 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057"
Apr 17 18:10:31.020047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.019468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" event={"ID":"0c89c182-082a-4aa5-95e4-40fa3cd0c63d","Type":"ContainerStarted","Data":"02c6130b2618b8b00ff09de7ff5d1eedcb195554700b5d36a301bd7a3e259f5c"}
Apr 17 18:10:31.020047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.019539 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:31.020047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.019648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:31.020047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.019744 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:31.039951 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.039895 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:31.040162 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.040143 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv"
Apr 17 18:10:31.046059 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.045996 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" podStartSLOduration=7.820852752 podStartE2EDuration="27.045977194s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="2026-04-17 18:10:05.024769256 +0000 UTC m=+1.797244286" lastFinishedPulling="2026-04-17 18:10:24.249893688 +0000 UTC m=+21.022368728" observedRunningTime="2026-04-17 18:10:31.045656086 +0000 UTC m=+27.818131138" watchObservedRunningTime="2026-04-17 18:10:31.045977194 +0000 UTC m=+27.818452247"
Apr 17 18:10:31.787013 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.786976 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:31.787199 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:31.786987 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d"
Apr 17 18:10:31.787199 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:31.787095 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:31.787285 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:31.787219 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250"
Apr 17 18:10:32.022690 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.022655 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="d73ad9e4c54c4a8cc148ab65cb315e40a3fb09ef662f35357d0f22572ff89d1b" exitCode=0
Apr 17 18:10:32.023136 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.022735 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"d73ad9e4c54c4a8cc148ab65cb315e40a3fb09ef662f35357d0f22572ff89d1b"}
Apr 17 18:10:32.660704 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.660663 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nns8d"]
Apr 17 18:10:32.660897 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.660808 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d"
Apr 17 18:10:32.660964 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:32.660901 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd"
Apr 17 18:10:32.670427 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.670389 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nmf4d"]
Apr 17 18:10:32.670633 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.670534 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:32.670694 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:32.670640 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:32.671195 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.671159 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zm674"] Apr 17 18:10:32.671299 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:32.671286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:32.671388 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:32.671369 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:34.029483 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:34.029249 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="7b12ed60a43e67c653ec481dc25f43e153e6e6b925a91686ff57a7bf15e134b5" exitCode=0 Apr 17 18:10:34.029483 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:34.029340 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"7b12ed60a43e67c653ec481dc25f43e153e6e6b925a91686ff57a7bf15e134b5"} Apr 17 18:10:34.787023 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:34.786991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:34.787142 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:34.786998 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:34.787142 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:34.787115 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:34.787247 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:34.787013 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:34.787295 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:34.787270 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:34.787295 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:34.787176 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:35.034484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:35.034391 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="26104b9046f31ac0a705209221bf7cd4b5ff45ada7389f65196e7951ea3583f6" exitCode=0 Apr 17 18:10:35.034484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:35.034449 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"26104b9046f31ac0a705209221bf7cd4b5ff45ada7389f65196e7951ea3583f6"} Apr 17 18:10:36.401051 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:36.401015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" 
(UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:36.401073 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401193 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401219 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401243 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:08.401237715 +0000 UTC m=+65.173712751 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401257 2576 projected.go:194] Error preparing data for projected volume kube-api-access-fg4vc for pod openshift-network-diagnostics/network-check-target-zm674: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:36.401645 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.401334 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc podName:c4c3a5fb-6090-40b4-b79d-305cd89dd057 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:08.401300623 +0000 UTC m=+65.173775665 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-fg4vc" (UniqueName: "kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc") pod "network-check-target-zm674" (UID: "c4c3a5fb-6090-40b4-b79d-305cd89dd057") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 18:10:36.787833 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:36.787736 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:36.787833 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:36.787797 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:36.787833 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:36.787736 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:36.788108 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.787874 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zm674" podUID="c4c3a5fb-6090-40b4-b79d-305cd89dd057" Apr 17 18:10:36.788108 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.787991 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-nns8d" podUID="8d7c622e-a8b2-4ec5-b59a-62c39c3285bd" Apr 17 18:10:36.788108 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:36.788071 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:10:37.103549 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.103467 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-134-83.ec2.internal" event="NodeReady" Apr 17 18:10:37.103699 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.103620 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 18:10:37.134994 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.134960 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:10:37.138922 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.138889 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"] Apr 17 18:10:37.143575 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.143529 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"] Apr 17 18:10:37.144074 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.144051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:37.144141 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.144119 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 18:10:37.146552 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146516 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 18:10:37.146552 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146541 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 18:10:37.146759 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146567 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 18:10:37.146759 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146622 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 18:10:37.146759 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146547 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 18:10:37.146913 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.146781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 18:10:37.147136 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.147120 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wbd55\"" Apr 17 18:10:37.147622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.147602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 18:10:37.149212 ip-10-0-134-83 kubenswrapper[2576]: I0417 
18:10:37.149156 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"] Apr 17 18:10:37.149347 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.149313 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" Apr 17 18:10:37.153206 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.152274 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 17 18:10:37.153206 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.152661 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 17 18:10:37.153206 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.152737 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 17 18:10:37.153206 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.152817 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 17 18:10:37.153541 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.153214 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"] Apr 17 18:10:37.153541 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.153383 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" Apr 17 18:10:37.155466 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.155439 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 18:10:37.155889 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.155770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-knrwx\"" Apr 17 18:10:37.156333 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156313 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 18:10:37.156863 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156841 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:10:37.156946 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156870 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"] Apr 17 18:10:37.156946 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156882 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"] Apr 17 18:10:37.156946 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156897 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7b5fn"] Apr 17 18:10:37.157098 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.156984 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:37.159048 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.159026 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tsj6l\"" Apr 17 18:10:37.159048 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.159026 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 18:10:37.159315 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.159300 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 18:10:37.160796 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.160779 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:37.163659 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.163638 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 18:10:37.163782 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.163669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 18:10:37.163845 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.163796 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-46dtx\"" Apr 17 18:10:37.168304 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.168275 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"] Apr 17 18:10:37.169811 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.169634 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"] Apr 17 18:10:37.170510 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.170490 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7b5fn"] Apr 17 18:10:37.256369 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.256335 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xmchg"] Apr 17 18:10:37.261102 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.261061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:37.263460 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.263430 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hwgzd\"" Apr 17 18:10:37.263460 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.263460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 18:10:37.263668 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.263435 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 18:10:37.263668 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.263616 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 18:10:37.276735 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.276702 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmchg"] Apr 17 18:10:37.309410 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309374 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") 
pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:37.309410 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309415 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:37.309678 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:37.309678 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309502 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-tmp-dir\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:37.309678 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309615 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/68e5f44a-3a60-429a-b683-8a89ca872d0e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" Apr 17 18:10:37.309678 ip-10-0-134-83 
kubenswrapper[2576]: I0417 18:10:37.309653 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309681 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309711 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-config-volume\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309754 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309774 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qzst\" (UniqueName: \"kubernetes.io/projected/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-kube-api-access-2qzst\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309835 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a2d45c46-4c77-4525-aba9-4863a1b296ee-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.309882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309861 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4518d856-34d4-4abd-a245-c368bbffa021-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309888 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309945 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bzj\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.309980 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310005 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310021 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffhtx\" (UniqueName: \"kubernetes.io/projected/a2d45c46-4c77-4525-aba9-4863a1b296ee-kube-api-access-ffhtx\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310044 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zns5v\" (UniqueName: \"kubernetes.io/projected/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-kube-api-access-zns5v\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310070 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftrv\" (UniqueName: \"kubernetes.io/projected/68e5f44a-3a60-429a-b683-8a89ca872d0e-kube-api-access-vftrv\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310093 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-tmp\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310136 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.310236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310175 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.310716 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.310716 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.310279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.411671 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffhtx\" (UniqueName: \"kubernetes.io/projected/a2d45c46-4c77-4525-aba9-4863a1b296ee-kube-api-access-ffhtx\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.411671 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411638 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-tmp-dir\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411705 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411801 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411829 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.411841 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.411923 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:37.911900798 +0000 UTC m=+34.684375834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.411950 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.412008 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:37.911991711 +0000 UTC m=+34.684466758 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412117 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-tmp-dir\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.411951 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4518d856-34d4-4abd-a245-c368bbffa021-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412208 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a2d45c46-4c77-4525-aba9-4863a1b296ee-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412266 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.412331 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjms4\" (UniqueName: \"kubernetes.io/projected/057fe092-ea34-41d4-a4dc-e565cd2567bf-kube-api-access-rjms4\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412529 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zns5v\" (UniqueName: \"kubernetes.io/projected/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-kube-api-access-zns5v\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vftrv\" (UniqueName: \"kubernetes.io/projected/68e5f44a-3a60-429a-b683-8a89ca872d0e-kube-api-access-vftrv\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-tmp\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.412651 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412661 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.412670 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/68e5f44a-3a60-429a-b683-8a89ca872d0e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412696 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-config-volume\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4518d856-34d4-4abd-a245-c368bbffa021-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qzst\" (UniqueName: \"kubernetes.io/projected/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-kube-api-access-2qzst\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.413065 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.412749 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:37.91273213 +0000 UTC m=+34.685207160 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found
Apr 17 18:10:37.413846 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.412956 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/a2d45c46-4c77-4525-aba9-4863a1b296ee-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.413846 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.413030 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-tmp\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.413846 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.413401 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-config-volume\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.414746 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.414215 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.416016 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.415675 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.417763 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.417763 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417623 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-ca\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.417763 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.417763 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417757 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg"
Apr 17 18:10:37.418006 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.418006 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97bzj\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.418006 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.418006 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.417968 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/68e5f44a-3a60-429a-b683-8a89ca872d0e-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.418745 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.418716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.418745 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.418736 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a2d45c46-4c77-4525-aba9-4863a1b296ee-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.418975 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.418960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.420258 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.420216 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-klusterlet-config\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.421823 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.421757 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.422697 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.422341 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zns5v\" (UniqueName: \"kubernetes.io/projected/e2c31d05-fcef-4477-8a44-6c7fe9073a7b-kube-api-access-zns5v\") pod \"klusterlet-addon-workmgr-7b8665db88-nsgqh\" (UID: \"e2c31d05-fcef-4477-8a44-6c7fe9073a7b\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.422697 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.422472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffhtx\" (UniqueName: \"kubernetes.io/projected/a2d45c46-4c77-4525-aba9-4863a1b296ee-kube-api-access-ffhtx\") pod \"cluster-proxy-proxy-agent-f6fb56456-f6skf\" (UID: \"a2d45c46-4c77-4525-aba9-4863a1b296ee\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.422697 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.422565 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qzst\" (UniqueName: \"kubernetes.io/projected/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-kube-api-access-2qzst\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.422697 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.422665 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftrv\" (UniqueName: \"kubernetes.io/projected/68e5f44a-3a60-429a-b683-8a89ca872d0e-kube-api-access-vftrv\") pod \"managed-serviceaccount-addon-agent-6498468c77-hdltk\" (UID: \"68e5f44a-3a60-429a-b683-8a89ca872d0e\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.428062 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.428034 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bzj\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.460321 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.460272 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"
Apr 17 18:10:37.498470 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.498426 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"
Apr 17 18:10:37.506490 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.506455 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"
Apr 17 18:10:37.518820 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.518441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg"
Apr 17 18:10:37.518820 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.518535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rjms4\" (UniqueName: \"kubernetes.io/projected/057fe092-ea34-41d4-a4dc-e565cd2567bf-kube-api-access-rjms4\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg"
Apr 17 18:10:37.518820 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.518589 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 17 18:10:37.518820 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.518748 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:10:38.018724419 +0000 UTC m=+34.791199462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found
Apr 17 18:10:37.528723 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.528213 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjms4\" (UniqueName: \"kubernetes.io/projected/057fe092-ea34-41d4-a4dc-e565cd2567bf-kube-api-access-rjms4\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg"
Apr 17 18:10:37.640259 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.639990 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh"]
Apr 17 18:10:37.644717 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:37.644681 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c31d05_fcef_4477_8a44_6c7fe9073a7b.slice/crio-16e5968e1bf7c48e2d74df302825941b81ce636f885f89e4364beddec2341dfc WatchSource:0}: Error finding container 16e5968e1bf7c48e2d74df302825941b81ce636f885f89e4364beddec2341dfc: Status 404 returned error can't find the container with id 16e5968e1bf7c48e2d74df302825941b81ce636f885f89e4364beddec2341dfc
Apr 17 18:10:37.662430 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.662292 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf"]
Apr 17 18:10:37.671004 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:37.670969 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d45c46_4c77_4525_aba9_4863a1b296ee.slice/crio-15579a1e4930ba1ce41c7d720d28d160fa6962a371a8ca6ffd537385db1eec8e WatchSource:0}: Error finding container 15579a1e4930ba1ce41c7d720d28d160fa6962a371a8ca6ffd537385db1eec8e: Status 404 returned error can't find the container with id 15579a1e4930ba1ce41c7d720d28d160fa6962a371a8ca6ffd537385db1eec8e
Apr 17 18:10:37.684286 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.684238 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk"]
Apr 17 18:10:37.689253 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:37.689216 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68e5f44a_3a60_429a_b683_8a89ca872d0e.slice/crio-91adc152257bc45d8e371772e786369a1a70383836b768f136e281aef59e51f0 WatchSource:0}: Error finding container 91adc152257bc45d8e371772e786369a1a70383836b768f136e281aef59e51f0: Status 404 returned error can't find the container with id 91adc152257bc45d8e371772e786369a1a70383836b768f136e281aef59e51f0
Apr 17 18:10:37.922424 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.922335 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn"
Apr 17 18:10:37.922424 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.922388 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"
Apr 17 18:10:37.922606 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:37.922441 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb"
Apr 17 18:10:37.922606 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922507 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 17 18:10:37.922606 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922541 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 17 18:10:37.922606 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922579 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 17 18:10:37.922606 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922594 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found
Apr 17 18:10:37.922771 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922583 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:38.922566345 +0000 UTC m=+35.695041375 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found
Apr 17 18:10:37.922771 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922656 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:38.922639767 +0000 UTC m=+35.695114807 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found
Apr 17 18:10:37.922771 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:37.922673 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:38.922663208 +0000 UTC m=+35.695138243 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:10:38.023454 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.023331 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:38.023656 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.023492 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:38.023656 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.023576 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:10:39.023554759 +0000 UTC m=+35.796029809 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:10:38.041667 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.041624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" event={"ID":"e2c31d05-fcef-4477-8a44-6c7fe9073a7b","Type":"ContainerStarted","Data":"16e5968e1bf7c48e2d74df302825941b81ce636f885f89e4364beddec2341dfc"} Apr 17 18:10:38.042679 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.042656 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerStarted","Data":"15579a1e4930ba1ce41c7d720d28d160fa6962a371a8ca6ffd537385db1eec8e"} Apr 17 18:10:38.043620 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.043595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" event={"ID":"68e5f44a-3a60-429a-b683-8a89ca872d0e","Type":"ContainerStarted","Data":"91adc152257bc45d8e371772e786369a1a70383836b768f136e281aef59e51f0"} Apr 17 18:10:38.787595 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.787556 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:38.788279 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.788026 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:10:38.788541 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.788521 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:10:38.793252 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.793222 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-c85k4\"" Apr 17 18:10:38.793544 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.793496 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:10:38.793793 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.793775 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 18:10:38.794078 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.794058 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:10:38.794404 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.794383 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:10:38.794656 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.794638 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nnqt\"" Apr 17 18:10:38.933152 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.933111 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:38.933355 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.933208 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:38.933355 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:38.933322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:38.933473 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.933454 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:38.933538 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.933526 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:40.933506934 +0000 UTC m=+37.705981970 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:10:38.933973 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.933948 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:10:38.934076 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.934009 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:40.933992339 +0000 UTC m=+37.706467369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:10:38.934143 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.934099 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:10:38.934143 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.934113 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:10:38.934291 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:38.934148 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. 
No retries permitted until 2026-04-17 18:10:40.934136764 +0000 UTC m=+37.706611808 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:10:39.033850 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:39.033804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:39.034054 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:39.034001 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:39.034164 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:39.034073 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:10:41.034051043 +0000 UTC m=+37.806526093 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:10:39.740988 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:39.740280 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:39.757866 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:39.757788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8d7c622e-a8b2-4ec5-b59a-62c39c3285bd-original-pull-secret\") pod \"global-pull-secret-syncer-nns8d\" (UID: \"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd\") " pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:40.009895 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:40.008971 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-nns8d" Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:40.953234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:40.953301 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:40.953357 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953411 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953434 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953492 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:44.9534722 +0000 UTC m=+41.725947230 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953507 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:44.953501196 +0000 UTC m=+41.725976225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953520 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953537 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:10:40.953616 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:40.953586 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" 
failed. No retries permitted until 2026-04-17 18:10:44.953569901 +0000 UTC m=+41.726044934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:10:41.054766 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:41.054250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:41.054766 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:41.054390 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:41.054766 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:41.054443 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:10:45.054428114 +0000 UTC m=+41.826903166 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:10:44.990012 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:44.989959 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:44.990028 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:44.990082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990132 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990203 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:10:44.990475 ip-10-0-134-83 
kubenswrapper[2576]: E0417 18:10:44.990238 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:52.990215979 +0000 UTC m=+49.762691032 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990255 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:10:52.990248383 +0000 UTC m=+49.762723413 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990274 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990289 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:10:44.990475 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:44.990341 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:10:52.990324066 +0000 UTC m=+49.762799115 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:10:45.091084 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:45.091040 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:45.091283 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:45.091176 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:45.091283 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:45.091276 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:10:53.091254613 +0000 UTC m=+49.863729647 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:10:46.389505 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:46.388844 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-nns8d"] Apr 17 18:10:46.517541 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:10:46.517503 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d7c622e_a8b2_4ec5_b59a_62c39c3285bd.slice/crio-22175ed90dad0fc4f3f186bec22d014a61baa044570027fcd6c40728a978248a WatchSource:0}: Error finding container 22175ed90dad0fc4f3f186bec22d014a61baa044570027fcd6c40728a978248a: Status 404 returned error can't find the container with id 22175ed90dad0fc4f3f186bec22d014a61baa044570027fcd6c40728a978248a Apr 17 18:10:47.071074 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.071023 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" event={"ID":"68e5f44a-3a60-429a-b683-8a89ca872d0e","Type":"ContainerStarted","Data":"4c6ca93f9328675f4826424ed86e71550abf133acad4974f617192ba66e9a1ba"} Apr 17 18:10:47.072585 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.072541 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" event={"ID":"e2c31d05-fcef-4477-8a44-6c7fe9073a7b","Type":"ContainerStarted","Data":"62c4e261bf2e82ab0af278492fc870328dd4ec7d3b97bacf9364b1884c8e39eb"} Apr 17 18:10:47.072779 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.072760 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 
18:10:47.074021 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.073991 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerStarted","Data":"3ac6564a61385914027ff51cbf63aeb00847df5849a376d6ad82fd3668562e1b"} Apr 17 18:10:47.074753 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.074728 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 18:10:47.075236 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.075205 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nns8d" event={"ID":"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd","Type":"ContainerStarted","Data":"22175ed90dad0fc4f3f186bec22d014a61baa044570027fcd6c40728a978248a"} Apr 17 18:10:47.078305 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.078266 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="76f68187612cc661e259a4897fea6b5ee37ff0fd1feb0c32018bbcce5e3ef520" exitCode=0 Apr 17 18:10:47.078412 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.078323 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"76f68187612cc661e259a4897fea6b5ee37ff0fd1feb0c32018bbcce5e3ef520"} Apr 17 18:10:47.107920 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.107850 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" podStartSLOduration=18.532150504 podStartE2EDuration="27.107832695s" podCreationTimestamp="2026-04-17 18:10:20 +0000 UTC" firstStartedPulling="2026-04-17 18:10:37.691390362 +0000 UTC 
m=+34.463865394" lastFinishedPulling="2026-04-17 18:10:46.267072544 +0000 UTC m=+43.039547585" observedRunningTime="2026-04-17 18:10:47.107712894 +0000 UTC m=+43.880187949" watchObservedRunningTime="2026-04-17 18:10:47.107832695 +0000 UTC m=+43.880307749" Apr 17 18:10:47.177552 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:47.177501 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" podStartSLOduration=18.281134995 podStartE2EDuration="27.177484509s" podCreationTimestamp="2026-04-17 18:10:20 +0000 UTC" firstStartedPulling="2026-04-17 18:10:37.64703964 +0000 UTC m=+34.419514673" lastFinishedPulling="2026-04-17 18:10:46.543389152 +0000 UTC m=+43.315864187" observedRunningTime="2026-04-17 18:10:47.176678022 +0000 UTC m=+43.949153097" watchObservedRunningTime="2026-04-17 18:10:47.177484509 +0000 UTC m=+43.949959560" Apr 17 18:10:48.085083 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:48.085037 2576 generic.go:358] "Generic (PLEG): container finished" podID="8ff23a89-3da2-420c-a6f3-bf94173e14c7" containerID="0e47af1fe8fdc02a22d9fb18294cebe02088fca43832d127c944cb2bff51bab8" exitCode=0 Apr 17 18:10:48.085576 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:48.085239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerDied","Data":"0e47af1fe8fdc02a22d9fb18294cebe02088fca43832d127c944cb2bff51bab8"} Apr 17 18:10:49.091133 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:49.091093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" event={"ID":"8ff23a89-3da2-420c-a6f3-bf94173e14c7","Type":"ContainerStarted","Data":"0a7d4c3fc122b5b0a82f69694fb425310e5b7e9fb15056cec21565c29f385a4e"} Apr 17 18:10:49.121129 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:49.121080 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zhcv2" podStartSLOduration=4.853856043 podStartE2EDuration="46.121060447s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:10:04.992133042 +0000 UTC m=+1.764608071" lastFinishedPulling="2026-04-17 18:10:46.259337407 +0000 UTC m=+43.031812475" observedRunningTime="2026-04-17 18:10:49.117862304 +0000 UTC m=+45.890337356" watchObservedRunningTime="2026-04-17 18:10:49.121060447 +0000 UTC m=+45.893535500" Apr 17 18:10:50.094753 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:50.094714 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerStarted","Data":"ee05c669224f2177e464c029a19134296a8c435a57e4ca0b02bfc537e9a249d2"} Apr 17 18:10:50.094753 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:50.094756 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerStarted","Data":"4b6364967fe304f64455da18dfe8c06b5ef3bd0a181044a6366d52b8d42a6519"} Apr 17 18:10:50.114344 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:50.114293 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" podStartSLOduration=18.719637256 podStartE2EDuration="30.114277042s" podCreationTimestamp="2026-04-17 18:10:20 +0000 UTC" firstStartedPulling="2026-04-17 18:10:37.673691602 +0000 UTC m=+34.446166647" lastFinishedPulling="2026-04-17 18:10:49.068331403 +0000 UTC m=+45.840806433" observedRunningTime="2026-04-17 18:10:50.113093709 +0000 UTC m=+46.885568752" watchObservedRunningTime="2026-04-17 18:10:50.114277042 +0000 UTC m=+46.886752094" Apr 17 
18:10:53.069376 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.069329 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:10:53.069376 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.069383 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.069421 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069492 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069510 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069559 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. 
No retries permitted until 2026-04-17 18:11:09.069542953 +0000 UTC m=+65.842017983 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069575 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:09.069569147 +0000 UTC m=+65.842044177 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069595 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069618 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:10:53.069884 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.069675 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:11:09.069656948 +0000 UTC m=+65.842131978 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:10:53.103564 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.103528 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-nns8d" event={"ID":"8d7c622e-a8b2-4ec5-b59a-62c39c3285bd","Type":"ContainerStarted","Data":"ebcb1cffce4af46057d7e0f82ef0df9fa6ab01b1dcf6ce5339373d894197a6d0"} Apr 17 18:10:53.116728 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.116678 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-nns8d" podStartSLOduration=40.634679352 podStartE2EDuration="46.116663191s" podCreationTimestamp="2026-04-17 18:10:07 +0000 UTC" firstStartedPulling="2026-04-17 18:10:46.524453078 +0000 UTC m=+43.296928109" lastFinishedPulling="2026-04-17 18:10:52.006436914 +0000 UTC m=+48.778911948" observedRunningTime="2026-04-17 18:10:53.116377364 +0000 UTC m=+49.888852416" watchObservedRunningTime="2026-04-17 18:10:53.116663191 +0000 UTC m=+49.889138242" Apr 17 18:10:53.170914 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:10:53.170863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:10:53.171080 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.170991 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:10:53.171080 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:10:53.171062 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:11:09.171043796 +0000 UTC m=+65.943518841 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:11:03.036771 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:03.036740 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5w6tv" Apr 17 18:11:08.495620 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.495574 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:11:08.495620 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.495630 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:11:08.498292 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.498266 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 18:11:08.498357 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.498299 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 18:11:08.506280 
ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:08.506241 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:11:08.506388 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:08.506339 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:12.506320937 +0000 UTC m=+129.278795971 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : secret "metrics-daemon-secret" not found Apr 17 18:11:08.508438 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.508421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 18:11:08.520432 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.520396 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fg4vc\" (UniqueName: \"kubernetes.io/projected/c4c3a5fb-6090-40b4-b79d-305cd89dd057-kube-api-access-fg4vc\") pod \"network-check-target-zm674\" (UID: \"c4c3a5fb-6090-40b4-b79d-305cd89dd057\") " pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:11:08.537712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.537674 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5nnqt\"" Apr 17 18:11:08.545591 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.545562 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:11:08.668090 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:08.668056 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zm674"] Apr 17 18:11:08.673047 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:11:08.673008 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4c3a5fb_6090_40b4_b79d_305cd89dd057.slice/crio-93a76ab02ff52fe9e073b99c33b81f8311e07bcfacb9916a7152f6c2bec891cd WatchSource:0}: Error finding container 93a76ab02ff52fe9e073b99c33b81f8311e07bcfacb9916a7152f6c2bec891cd: Status 404 returned error can't find the container with id 93a76ab02ff52fe9e073b99c33b81f8311e07bcfacb9916a7152f6c2bec891cd Apr 17 18:11:09.100915 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:09.100873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:09.100931 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:09.100966 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod 
\"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101058 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101073 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101108 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:11:09.101128 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101123 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:11:09.101377 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101153 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:11:41.101133845 +0000 UTC m=+97.873608875 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:11:09.101377 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101172 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:11:41.101163121 +0000 UTC m=+97.873638160 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:11:09.101377 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.101205 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. No retries permitted until 2026-04-17 18:11:41.101197694 +0000 UTC m=+97.873672723 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:11:09.145945 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:09.145898 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zm674" event={"ID":"c4c3a5fb-6090-40b4-b79d-305cd89dd057","Type":"ContainerStarted","Data":"93a76ab02ff52fe9e073b99c33b81f8311e07bcfacb9916a7152f6c2bec891cd"} Apr 17 18:11:09.201565 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:09.201515 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:11:09.201750 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.201685 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:11:09.201819 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:09.201773 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:11:41.201750707 +0000 UTC m=+97.974225752 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:11:12.154866 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:12.154831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zm674" event={"ID":"c4c3a5fb-6090-40b4-b79d-305cd89dd057","Type":"ContainerStarted","Data":"37bcd5fd7ff6bbdeddb11c3bcc95c453a8324c0e3df99a59df337767246dbe87"} Apr 17 18:11:12.155268 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:12.154977 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:11:12.169817 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:12.169762 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zm674" podStartSLOduration=66.097195837 podStartE2EDuration="1m9.169745981s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:11:08.675134596 +0000 UTC m=+65.447609627" lastFinishedPulling="2026-04-17 18:11:11.747684724 +0000 UTC m=+68.520159771" observedRunningTime="2026-04-17 18:11:12.168927798 +0000 UTC m=+68.941402871" watchObservedRunningTime="2026-04-17 18:11:12.169745981 +0000 UTC m=+68.942221032" Apr 17 18:11:41.171316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:41.171142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:11:41.171316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:41.171220 2576 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:11:41.171316 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:41.171267 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171316 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171379 2576 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171383 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171403 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-8467bd764f-4w7nb: secret "image-registry-tls" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171414 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls podName:ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e nodeName:}" failed. 
No retries permitted until 2026-04-17 18:12:45.171394641 +0000 UTC m=+161.943869673 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls") pod "dns-default-7b5fn" (UID: "ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e") : secret "dns-default-metrics-tls" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171436 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert podName:4518d856-34d4-4abd-a245-c368bbffa021 nodeName:}" failed. No retries permitted until 2026-04-17 18:12:45.171423485 +0000 UTC m=+161.943898516 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-rrl6j" (UID: "4518d856-34d4-4abd-a245-c368bbffa021") : secret "networking-console-plugin-cert" not found Apr 17 18:11:41.171871 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.171457 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls podName:cee48042-1c4c-4a0d-b965-f2145692570e nodeName:}" failed. No retries permitted until 2026-04-17 18:12:45.171441806 +0000 UTC m=+161.943916837 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls") pod "image-registry-8467bd764f-4w7nb" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e") : secret "image-registry-tls" not found Apr 17 18:11:41.272632 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:41.272599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:11:41.272783 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.272761 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 18:11:41.272847 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:11:41.272836 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert podName:057fe092-ea34-41d4-a4dc-e565cd2567bf nodeName:}" failed. No retries permitted until 2026-04-17 18:12:45.272817022 +0000 UTC m=+162.045292053 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert") pod "ingress-canary-xmchg" (UID: "057fe092-ea34-41d4-a4dc-e565cd2567bf") : secret "canary-serving-cert" not found Apr 17 18:11:43.160243 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:11:43.160212 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zm674" Apr 17 18:12:12.519664 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:12.519618 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:12:12.520156 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:12.519771 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 18:12:12.520156 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:12.519854 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs podName:1d2787f0-024b-4888-980e-e458a856a250 nodeName:}" failed. No retries permitted until 2026-04-17 18:14:14.519837408 +0000 UTC m=+251.292312438 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs") pod "network-metrics-daemon-nmf4d" (UID: "1d2787f0-024b-4888-980e-e458a856a250") : secret "metrics-daemon-secret" not found Apr 17 18:12:28.545639 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:28.545611 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fdxn5_c650676f-6713-458e-b337-21b94770e9f5/dns-node-resolver/0.log" Apr 17 18:12:29.750095 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:29.750062 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cm9j8_d2f215a7-d8e6-4b38-bd88-ad6bf1f07470/node-ca/0.log" Apr 17 18:12:40.175621 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:40.175572 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" Apr 17 18:12:40.224805 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:40.224755 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" podUID="4518d856-34d4-4abd-a245-c368bbffa021" Apr 17 18:12:40.231994 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:40.231954 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-7b5fn" podUID="ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e" Apr 17 18:12:40.273128 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:40.273082 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xmchg" podUID="057fe092-ea34-41d4-a4dc-e565cd2567bf" Apr 17 18:12:40.364511 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:40.364476 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:12:40.364678 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:40.364485 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7b5fn" Apr 17 18:12:40.364678 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:40.364484 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:12:41.819966 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:12:41.819912 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-nmf4d" podUID="1d2787f0-024b-4888-980e-e458a856a250" Apr 17 18:12:45.193251 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.193214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:12:45.193759 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.193269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:12:45.193759 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.193322 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:12:45.195916 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.195879 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4518d856-34d4-4abd-a245-c368bbffa021-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-rrl6j\" (UID: \"4518d856-34d4-4abd-a245-c368bbffa021\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:12:45.196062 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.195963 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"image-registry-8467bd764f-4w7nb\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:12:45.196260 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.196240 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e-metrics-tls\") pod \"dns-default-7b5fn\" (UID: \"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e\") " pod="openshift-dns/dns-default-7b5fn" Apr 17 18:12:45.294232 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.294167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:12:45.296891 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.296862 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/057fe092-ea34-41d4-a4dc-e565cd2567bf-cert\") pod \"ingress-canary-xmchg\" (UID: \"057fe092-ea34-41d4-a4dc-e565cd2567bf\") " pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:12:45.469308 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.469211 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-hwgzd\"" Apr 17 18:12:45.469308 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.469228 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-46dtx\"" Apr 17 18:12:45.469308 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.469233 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-wbd55\"" Apr 17 18:12:45.476494 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.476451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xmchg" Apr 17 18:12:45.476494 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.476494 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7b5fn" Apr 17 18:12:45.476629 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.476478 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:12:45.628349 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.628313 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:12:45.632322 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:12:45.632271 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcee48042_1c4c_4a0d_b965_f2145692570e.slice/crio-cea6c4dac534850c8810472279d186b76aa3fede704d6e74d7c18581eccb7efa WatchSource:0}: Error finding container cea6c4dac534850c8810472279d186b76aa3fede704d6e74d7c18581eccb7efa: Status 404 returned error can't find the container with id cea6c4dac534850c8810472279d186b76aa3fede704d6e74d7c18581eccb7efa Apr 17 18:12:45.641319 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.641290 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7b5fn"] Apr 17 18:12:45.644205 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:12:45.644150 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee9f00fa_ccd7_4b03_8dee_15be1ab71b6e.slice/crio-80e3b9ecee675fc3aa5175d91cc7d10b3b74f605fda13c3608cf56461b24dc70 WatchSource:0}: Error finding container 80e3b9ecee675fc3aa5175d91cc7d10b3b74f605fda13c3608cf56461b24dc70: Status 404 returned error can't find the container with id 80e3b9ecee675fc3aa5175d91cc7d10b3b74f605fda13c3608cf56461b24dc70 Apr 17 18:12:45.656125 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:45.656093 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xmchg"] Apr 17 18:12:45.667367 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:12:45.667338 2576 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod057fe092_ea34_41d4_a4dc_e565cd2567bf.slice/crio-d89bec5a4d3128c76af2d5a65f1b029ff9322cd6c7d48fc8eeb280f824588174 WatchSource:0}: Error finding container d89bec5a4d3128c76af2d5a65f1b029ff9322cd6c7d48fc8eeb280f824588174: Status 404 returned error can't find the container with id d89bec5a4d3128c76af2d5a65f1b029ff9322cd6c7d48fc8eeb280f824588174 Apr 17 18:12:46.380691 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.380645 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" event={"ID":"cee48042-1c4c-4a0d-b965-f2145692570e","Type":"ContainerStarted","Data":"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672"} Apr 17 18:12:46.381212 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.380703 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" event={"ID":"cee48042-1c4c-4a0d-b965-f2145692570e","Type":"ContainerStarted","Data":"cea6c4dac534850c8810472279d186b76aa3fede704d6e74d7c18581eccb7efa"} Apr 17 18:12:46.381212 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.381030 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:12:46.382546 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.382500 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmchg" event={"ID":"057fe092-ea34-41d4-a4dc-e565cd2567bf","Type":"ContainerStarted","Data":"d89bec5a4d3128c76af2d5a65f1b029ff9322cd6c7d48fc8eeb280f824588174"} Apr 17 18:12:46.383959 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.383927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7b5fn" 
event={"ID":"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e","Type":"ContainerStarted","Data":"80e3b9ecee675fc3aa5175d91cc7d10b3b74f605fda13c3608cf56461b24dc70"} Apr 17 18:12:46.405273 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:46.405205 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" podStartSLOduration=162.405167328 podStartE2EDuration="2m42.405167328s" podCreationTimestamp="2026-04-17 18:10:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:12:46.404526603 +0000 UTC m=+163.177001655" watchObservedRunningTime="2026-04-17 18:12:46.405167328 +0000 UTC m=+163.177642379" Apr 17 18:12:47.074305 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.074219 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" podUID="e2c31d05-fcef-4477-8a44-6c7fe9073a7b" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.7:8000/readyz\": dial tcp 10.132.0.7:8000: connect: connection refused" Apr 17 18:12:47.388791 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.388743 2576 generic.go:358] "Generic (PLEG): container finished" podID="68e5f44a-3a60-429a-b683-8a89ca872d0e" containerID="4c6ca93f9328675f4826424ed86e71550abf133acad4974f617192ba66e9a1ba" exitCode=255 Apr 17 18:12:47.389273 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.388826 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" event={"ID":"68e5f44a-3a60-429a-b683-8a89ca872d0e","Type":"ContainerDied","Data":"4c6ca93f9328675f4826424ed86e71550abf133acad4974f617192ba66e9a1ba"} Apr 17 18:12:47.389273 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.389256 2576 scope.go:117] "RemoveContainer" 
containerID="4c6ca93f9328675f4826424ed86e71550abf133acad4974f617192ba66e9a1ba" Apr 17 18:12:47.390815 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.390784 2576 generic.go:358] "Generic (PLEG): container finished" podID="e2c31d05-fcef-4477-8a44-6c7fe9073a7b" containerID="62c4e261bf2e82ab0af278492fc870328dd4ec7d3b97bacf9364b1884c8e39eb" exitCode=1 Apr 17 18:12:47.390971 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.390818 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" event={"ID":"e2c31d05-fcef-4477-8a44-6c7fe9073a7b","Type":"ContainerDied","Data":"62c4e261bf2e82ab0af278492fc870328dd4ec7d3b97bacf9364b1884c8e39eb"} Apr 17 18:12:47.391358 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.391336 2576 scope.go:117] "RemoveContainer" containerID="62c4e261bf2e82ab0af278492fc870328dd4ec7d3b97bacf9364b1884c8e39eb" Apr 17 18:12:47.460788 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.460754 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 18:12:47.507325 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.507276 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" Apr 17 18:12:47.527594 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.525623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-vsbmf"] Apr 17 18:12:47.530248 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.530024 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.532908 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.532811 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 18:12:47.532908 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.532880 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-s42m7\"" Apr 17 18:12:47.533173 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.533153 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 18:12:47.533402 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.533382 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 18:12:47.533502 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.533483 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 18:12:47.553502 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.552474 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vsbmf"] Apr 17 18:12:47.613405 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.613279 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.613584 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.613422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-crio-socket\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.613584 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.613465 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.613584 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.613495 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r82vf\" (UniqueName: \"kubernetes.io/projected/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-api-access-r82vf\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.613584 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.613529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-data-volume\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.713859 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.713817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-data-volume\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " 
pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.714046 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.713869 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.714046 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.713916 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-crio-socket\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.714046 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.713939 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.714046 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.713956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r82vf\" (UniqueName: \"kubernetes.io/projected/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-api-access-r82vf\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.714308 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.714285 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-crio-socket\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.715001 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.714392 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-data-volume\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.715001 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.714852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.717549 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.717494 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.723735 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.723697 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r82vf\" (UniqueName: \"kubernetes.io/projected/3cfcea2b-a665-48c5-b1bc-8840b437d6bc-kube-api-access-r82vf\") pod \"insights-runtime-extractor-vsbmf\" (UID: \"3cfcea2b-a665-48c5-b1bc-8840b437d6bc\") " pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.849360 ip-10-0-134-83 kubenswrapper[2576]: 
I0417 18:12:47.849311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-vsbmf" Apr 17 18:12:47.972449 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:47.972406 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-vsbmf"] Apr 17 18:12:47.977793 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:12:47.977761 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cfcea2b_a665_48c5_b1bc_8840b437d6bc.slice/crio-1da293560ff1e7ef030c175ac26078a9f428b03be233dd556f5661acbe3b9eb5 WatchSource:0}: Error finding container 1da293560ff1e7ef030c175ac26078a9f428b03be233dd556f5661acbe3b9eb5: Status 404 returned error can't find the container with id 1da293560ff1e7ef030c175ac26078a9f428b03be233dd556f5661acbe3b9eb5 Apr 17 18:12:48.395515 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.395478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xmchg" event={"ID":"057fe092-ea34-41d4-a4dc-e565cd2567bf","Type":"ContainerStarted","Data":"eea955fa0058a883672e7a62c0b9f60a18937f0cf17e4c283b4059b337b2257b"} Apr 17 18:12:48.397149 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.397118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7b5fn" event={"ID":"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e","Type":"ContainerStarted","Data":"d4f7c639bcb3208ceae3225c5c1b4b2f6e21eaf7f3294a4f9ac3f5f66e8fe99f"} Apr 17 18:12:48.397292 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.397156 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7b5fn" event={"ID":"ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e","Type":"ContainerStarted","Data":"4369f9efb24cad372a0139cbb845fe1a56deca246eeb389132a6c10d75a76575"} Apr 17 18:12:48.397292 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.397248 2576 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-7b5fn" Apr 17 18:12:48.398712 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.398677 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6498468c77-hdltk" event={"ID":"68e5f44a-3a60-429a-b683-8a89ca872d0e","Type":"ContainerStarted","Data":"c6a9270e8a6de33c8694daa6942d95bc6faffd3591fd25eead364e65cbb2515f"} Apr 17 18:12:48.400238 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.400208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" event={"ID":"e2c31d05-fcef-4477-8a44-6c7fe9073a7b","Type":"ContainerStarted","Data":"5ff2d2111924a28f28ac528f1635486f0ca5d0b0a1bf479f556558ed1542b923"} Apr 17 18:12:48.400400 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.400381 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 18:12:48.401411 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.401394 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7b8665db88-nsgqh" Apr 17 18:12:48.401567 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.401549 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vsbmf" event={"ID":"3cfcea2b-a665-48c5-b1bc-8840b437d6bc","Type":"ContainerStarted","Data":"88a8f5f5372f814f7028852c52d66a540aaed6f81fa34e29efc8d49c0914facf"} Apr 17 18:12:48.401614 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.401573 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vsbmf" 
event={"ID":"3cfcea2b-a665-48c5-b1bc-8840b437d6bc","Type":"ContainerStarted","Data":"1da293560ff1e7ef030c175ac26078a9f428b03be233dd556f5661acbe3b9eb5"} Apr 17 18:12:48.412355 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.412303 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xmchg" podStartSLOduration=129.605885097 podStartE2EDuration="2m11.412286697s" podCreationTimestamp="2026-04-17 18:10:37 +0000 UTC" firstStartedPulling="2026-04-17 18:12:45.669839395 +0000 UTC m=+162.442314427" lastFinishedPulling="2026-04-17 18:12:47.476240982 +0000 UTC m=+164.248716027" observedRunningTime="2026-04-17 18:12:48.411481514 +0000 UTC m=+165.183956567" watchObservedRunningTime="2026-04-17 18:12:48.412286697 +0000 UTC m=+165.184761750" Apr 17 18:12:48.447476 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:48.447420 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7b5fn" podStartSLOduration=129.617641351 podStartE2EDuration="2m11.447403605s" podCreationTimestamp="2026-04-17 18:10:37 +0000 UTC" firstStartedPulling="2026-04-17 18:12:45.646228552 +0000 UTC m=+162.418703581" lastFinishedPulling="2026-04-17 18:12:47.475990796 +0000 UTC m=+164.248465835" observedRunningTime="2026-04-17 18:12:48.447009201 +0000 UTC m=+165.219484311" watchObservedRunningTime="2026-04-17 18:12:48.447403605 +0000 UTC m=+165.219878657" Apr 17 18:12:49.406630 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:49.406588 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vsbmf" event={"ID":"3cfcea2b-a665-48c5-b1bc-8840b437d6bc","Type":"ContainerStarted","Data":"7f1809e6d5ff53154c83e7d2fedbe1b2c883e86123cab0021388422aca866e44"} Apr 17 18:12:50.411575 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:50.411462 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-vsbmf" 
event={"ID":"3cfcea2b-a665-48c5-b1bc-8840b437d6bc","Type":"ContainerStarted","Data":"b32df3b57bebc6c47a349e352f6dd5d28c7e56bdba34dc85e1075dcff1401d76"} Apr 17 18:12:50.430567 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:50.430511 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-vsbmf" podStartSLOduration=1.428446922 podStartE2EDuration="3.430495678s" podCreationTimestamp="2026-04-17 18:12:47 +0000 UTC" firstStartedPulling="2026-04-17 18:12:48.039904527 +0000 UTC m=+164.812379565" lastFinishedPulling="2026-04-17 18:12:50.041953291 +0000 UTC m=+166.814428321" observedRunningTime="2026-04-17 18:12:50.429030763 +0000 UTC m=+167.201505816" watchObservedRunningTime="2026-04-17 18:12:50.430495678 +0000 UTC m=+167.202970729" Apr 17 18:12:52.787386 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:52.787327 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:12:52.789737 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:52.789716 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-tsj6l\"" Apr 17 18:12:52.798531 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:52.798485 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" Apr 17 18:12:52.925506 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:52.925458 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j"] Apr 17 18:12:52.928294 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:12:52.928256 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4518d856_34d4_4abd_a245_c368bbffa021.slice/crio-c65f4ba4259cd3b6a364243d3d312e677f83b9a0f7159903e54bb713c1d6ce37 WatchSource:0}: Error finding container c65f4ba4259cd3b6a364243d3d312e677f83b9a0f7159903e54bb713c1d6ce37: Status 404 returned error can't find the container with id c65f4ba4259cd3b6a364243d3d312e677f83b9a0f7159903e54bb713c1d6ce37 Apr 17 18:12:53.422191 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:53.422150 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" event={"ID":"4518d856-34d4-4abd-a245-c368bbffa021","Type":"ContainerStarted","Data":"c65f4ba4259cd3b6a364243d3d312e677f83b9a0f7159903e54bb713c1d6ce37"} Apr 17 18:12:54.426604 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:54.426560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" event={"ID":"4518d856-34d4-4abd-a245-c368bbffa021","Type":"ContainerStarted","Data":"20308c783abc91a00fdea7d2d527a7380fd1a6736cce75eb6a18e33a54ab9358"} Apr 17 18:12:54.441039 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:54.440984 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-rrl6j" podStartSLOduration=167.551795854 podStartE2EDuration="2m48.440968824s" podCreationTimestamp="2026-04-17 18:10:06 +0000 UTC" firstStartedPulling="2026-04-17 18:12:52.930337494 
+0000 UTC m=+169.702812532" lastFinishedPulling="2026-04-17 18:12:53.819510455 +0000 UTC m=+170.591985502" observedRunningTime="2026-04-17 18:12:54.439951855 +0000 UTC m=+171.212426911" watchObservedRunningTime="2026-04-17 18:12:54.440968824 +0000 UTC m=+171.213443854" Apr 17 18:12:56.787041 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:56.786991 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:12:58.409492 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:12:58.409457 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7b5fn" Apr 17 18:13:05.481431 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:05.481387 2576 patch_prober.go:28] interesting pod/image-registry-8467bd764f-4w7nb container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 17 18:13:05.481956 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:05.481480 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 17 18:13:07.395742 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:07.395700 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:13:09.173014 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:09.172974 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:13:10.114716 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.114679 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/node-exporter-758sl"] Apr 17 18:13:10.121911 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.121878 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.124377 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.124353 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 17 18:13:10.124548 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.124499 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 17 18:13:10.124548 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.124525 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 17 18:13:10.125395 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.125375 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 17 18:13:10.125395 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.125395 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 17 18:13:10.125686 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.125441 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f528l\"" Apr 17 18:13:10.125686 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.125398 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 17 18:13:10.190076 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190028 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-textfile\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190076 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-root\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190106 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-metrics-client-ca\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190159 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190217 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n62j\" (UniqueName: \"kubernetes.io/projected/40f76e4a-b5c4-4066-9388-6997cc8391ba-kube-api-access-2n62j\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 
18:13:10.190237 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190296 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-sys\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-wtmp\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.190524 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.190398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291147 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291113 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2n62j\" (UniqueName: \"kubernetes.io/projected/40f76e4a-b5c4-4066-9388-6997cc8391ba-kube-api-access-2n62j\") pod \"node-exporter-758sl\" (UID: 
\"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291147 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291152 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291199 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-sys\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-wtmp\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:13:10.291289 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: 
I0417 18:13:10.291309 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-textfile\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291344 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-root\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291349 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-sys\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:13:10.291366 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls podName:40f76e4a-b5c4-4066-9388-6997cc8391ba nodeName:}" failed. No retries permitted until 2026-04-17 18:13:10.791343573 +0000 UTC m=+187.563818604 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls") pod "node-exporter-758sl" (UID: "40f76e4a-b5c4-4066-9388-6997cc8391ba") : secret "node-exporter-tls" not found Apr 17 18:13:10.291384 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291382 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-root\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291797 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291402 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-metrics-client-ca\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291797 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291439 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291797 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-wtmp\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.291797 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.291712 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-textfile\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.292044 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.292017 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-metrics-client-ca\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.292349 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.292333 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-accelerators-collector-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.293967 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.293942 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.299938 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.299903 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n62j\" (UniqueName: \"kubernetes.io/projected/40f76e4a-b5c4-4066-9388-6997cc8391ba-kube-api-access-2n62j\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 
18:13:10.796513 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.796473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:10.799096 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:10.799066 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/40f76e4a-b5c4-4066-9388-6997cc8391ba-node-exporter-tls\") pod \"node-exporter-758sl\" (UID: \"40f76e4a-b5c4-4066-9388-6997cc8391ba\") " pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:11.032811 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:11.032771 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-758sl" Apr 17 18:13:11.041812 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:13:11.041782 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f76e4a_b5c4_4066_9388_6997cc8391ba.slice/crio-eddd735eeb604f02c93f6a7bfa67a1a4d89fc4b5db1fd7fe91acc1309cdd5110 WatchSource:0}: Error finding container eddd735eeb604f02c93f6a7bfa67a1a4d89fc4b5db1fd7fe91acc1309cdd5110: Status 404 returned error can't find the container with id eddd735eeb604f02c93f6a7bfa67a1a4d89fc4b5db1fd7fe91acc1309cdd5110 Apr 17 18:13:11.476334 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:11.475282 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-758sl" event={"ID":"40f76e4a-b5c4-4066-9388-6997cc8391ba","Type":"ContainerStarted","Data":"eddd735eeb604f02c93f6a7bfa67a1a4d89fc4b5db1fd7fe91acc1309cdd5110"} Apr 17 18:13:12.479298 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:12.479259 2576 generic.go:358] "Generic 
(PLEG): container finished" podID="40f76e4a-b5c4-4066-9388-6997cc8391ba" containerID="2c8c464b111214307e0c1cd3a6dc653e0db4cdf8d98cc028d60e1abec76f8d58" exitCode=0 Apr 17 18:13:12.479685 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:12.479342 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-758sl" event={"ID":"40f76e4a-b5c4-4066-9388-6997cc8391ba","Type":"ContainerDied","Data":"2c8c464b111214307e0c1cd3a6dc653e0db4cdf8d98cc028d60e1abec76f8d58"} Apr 17 18:13:13.483752 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:13.483711 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-758sl" event={"ID":"40f76e4a-b5c4-4066-9388-6997cc8391ba","Type":"ContainerStarted","Data":"5e56add57d89fd31931aa537fb2fff5066dcdd51aa9680cae2495a05e00b067f"} Apr 17 18:13:13.483752 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:13.483751 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-758sl" event={"ID":"40f76e4a-b5c4-4066-9388-6997cc8391ba","Type":"ContainerStarted","Data":"23a2f4c3493dfa171280dca7b1ccf917d899d3fed0bef817dc13e552233374fe"} Apr 17 18:13:13.501444 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:13.501391 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-758sl" podStartSLOduration=2.753110826 podStartE2EDuration="3.501374523s" podCreationTimestamp="2026-04-17 18:13:10 +0000 UTC" firstStartedPulling="2026-04-17 18:13:11.043681309 +0000 UTC m=+187.816156341" lastFinishedPulling="2026-04-17 18:13:11.791945 +0000 UTC m=+188.564420038" observedRunningTime="2026-04-17 18:13:13.501132425 +0000 UTC m=+190.273607478" watchObservedRunningTime="2026-04-17 18:13:13.501374523 +0000 UTC m=+190.273849577" Apr 17 18:13:34.192432 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.192391 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" containerName="registry" containerID="cri-o://066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672" gracePeriod=30 Apr 17 18:13:34.444968 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.444895 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:13:34.491811 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491768 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.491811 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491852 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491889 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 
18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491911 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491930 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.491990 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492047 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.492022 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97bzj\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj\") pod \"cee48042-1c4c-4a0d-b965-f2145692570e\" (UID: \"cee48042-1c4c-4a0d-b965-f2145692570e\") " Apr 17 18:13:34.492353 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.492320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:34.492571 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.492536 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 18:13:34.494651 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.494618 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:34.494769 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.494672 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:34.494769 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.494670 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:34.494867 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.494830 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 18:13:34.494979 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.494962 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj" (OuterVolumeSpecName: "kube-api-access-97bzj") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "kube-api-access-97bzj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 18:13:34.501452 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.501408 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cee48042-1c4c-4a0d-b965-f2145692570e" (UID: "cee48042-1c4c-4a0d-b965-f2145692570e"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 18:13:34.537338 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.537304 2576 generic.go:358] "Generic (PLEG): container finished" podID="cee48042-1c4c-4a0d-b965-f2145692570e" containerID="066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672" exitCode=0 Apr 17 18:13:34.537484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.537372 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" Apr 17 18:13:34.537484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.537395 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" event={"ID":"cee48042-1c4c-4a0d-b965-f2145692570e","Type":"ContainerDied","Data":"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672"} Apr 17 18:13:34.537484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.537437 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-8467bd764f-4w7nb" event={"ID":"cee48042-1c4c-4a0d-b965-f2145692570e","Type":"ContainerDied","Data":"cea6c4dac534850c8810472279d186b76aa3fede704d6e74d7c18581eccb7efa"} Apr 17 18:13:34.537484 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.537456 2576 scope.go:117] "RemoveContainer" containerID="066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672" Apr 17 18:13:34.546259 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.546232 2576 scope.go:117] "RemoveContainer" containerID="066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672" Apr 17 18:13:34.546671 ip-10-0-134-83 kubenswrapper[2576]: E0417 18:13:34.546650 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672\": container with ID starting with 066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672 not found: ID does not exist" containerID="066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672" Apr 17 18:13:34.546717 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.546681 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672"} err="failed to get container status 
\"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672\": rpc error: code = NotFound desc = could not find container \"066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672\": container with ID starting with 066e6702f79cdb4d26527746bfcd1c265658437642db4b8d62fbf4aa2057a672 not found: ID does not exist" Apr 17 18:13:34.556981 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.556948 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:13:34.560788 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.560757 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-8467bd764f-4w7nb"] Apr 17 18:13:34.593085 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593057 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-trusted-ca\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593085 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593085 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-image-registry-private-configuration\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593099 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cee48042-1c4c-4a0d-b965-f2145692570e-registry-certificates\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593112 2576 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cee48042-1c4c-4a0d-b965-f2145692570e-installation-pull-secrets\") on node 
\"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593121 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cee48042-1c4c-4a0d-b965-f2145692570e-ca-trust-extracted\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593129 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-97bzj\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-kube-api-access-97bzj\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593138 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-registry-tls\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:34.593274 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:34.593147 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cee48042-1c4c-4a0d-b965-f2145692570e-bound-sa-token\") on node \"ip-10-0-134-83.ec2.internal\" DevicePath \"\"" Apr 17 18:13:35.790848 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:35.790805 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" path="/var/lib/kubelet/pods/cee48042-1c4c-4a0d-b965-f2145692570e/volumes" Apr 17 18:13:37.500902 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:37.500858 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" podUID="a2d45c46-4c77-4525-aba9-4863a1b296ee" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:13:47.500780 ip-10-0-134-83 
kubenswrapper[2576]: I0417 18:13:47.500738 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" podUID="a2d45c46-4c77-4525-aba9-4863a1b296ee" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:13:57.500425 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:57.500381 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" podUID="a2d45c46-4c77-4525-aba9-4863a1b296ee" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 17 18:13:57.500882 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:57.500452 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" Apr 17 18:13:57.500943 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:57.500926 2576 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"ee05c669224f2177e464c029a19134296a8c435a57e4ca0b02bfc537e9a249d2"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 17 18:13:57.500981 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:57.500963 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" podUID="a2d45c46-4c77-4525-aba9-4863a1b296ee" containerName="service-proxy" containerID="cri-o://ee05c669224f2177e464c029a19134296a8c435a57e4ca0b02bfc537e9a249d2" gracePeriod=30 Apr 17 18:13:58.609084 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:58.609050 2576 generic.go:358] "Generic (PLEG): container finished" podID="a2d45c46-4c77-4525-aba9-4863a1b296ee" 
containerID="ee05c669224f2177e464c029a19134296a8c435a57e4ca0b02bfc537e9a249d2" exitCode=2 Apr 17 18:13:58.609495 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:58.609118 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerDied","Data":"ee05c669224f2177e464c029a19134296a8c435a57e4ca0b02bfc537e9a249d2"} Apr 17 18:13:58.609495 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:13:58.609145 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-f6fb56456-f6skf" event={"ID":"a2d45c46-4c77-4525-aba9-4863a1b296ee","Type":"ContainerStarted","Data":"7d65c0d74d3fa55d70d87e1b8da0a9b7440562a54244f1a687111543508e9465"} Apr 17 18:14:14.592825 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:14.592776 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:14:14.595376 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:14.595351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1d2787f0-024b-4888-980e-e458a856a250-metrics-certs\") pod \"network-metrics-daemon-nmf4d\" (UID: \"1d2787f0-024b-4888-980e-e458a856a250\") " pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:14:14.790383 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:14.790349 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-c85k4\"" Apr 17 18:14:14.797912 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:14.797875 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-nmf4d" Apr 17 18:14:14.933829 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:14.933794 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-nmf4d"] Apr 17 18:14:14.936884 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:14:14.936847 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d2787f0_024b_4888_980e_e458a856a250.slice/crio-7582e20c145e2df1ac3f5864bad40b7cd0250e80fd13730fbd5acea24cf17f7b WatchSource:0}: Error finding container 7582e20c145e2df1ac3f5864bad40b7cd0250e80fd13730fbd5acea24cf17f7b: Status 404 returned error can't find the container with id 7582e20c145e2df1ac3f5864bad40b7cd0250e80fd13730fbd5acea24cf17f7b Apr 17 18:14:15.653511 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:15.653471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nmf4d" event={"ID":"1d2787f0-024b-4888-980e-e458a856a250","Type":"ContainerStarted","Data":"7582e20c145e2df1ac3f5864bad40b7cd0250e80fd13730fbd5acea24cf17f7b"} Apr 17 18:14:16.658850 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:16.658812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nmf4d" event={"ID":"1d2787f0-024b-4888-980e-e458a856a250","Type":"ContainerStarted","Data":"78afa0ddb195146dbefa661e66876a27e1cd30c9ac55feb7e74a0450e4a68e6a"} Apr 17 18:14:16.658850 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:16.658855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-nmf4d" event={"ID":"1d2787f0-024b-4888-980e-e458a856a250","Type":"ContainerStarted","Data":"1dc9689ae590b904b555e62e1744efde1d4492781a96a5bbdbce85002306f72d"} Apr 17 18:14:16.679406 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:14:16.679348 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-nmf4d" podStartSLOduration=252.593116098 podStartE2EDuration="4m13.679331973s" podCreationTimestamp="2026-04-17 18:10:03 +0000 UTC" firstStartedPulling="2026-04-17 18:14:14.938869684 +0000 UTC m=+251.711344729" lastFinishedPulling="2026-04-17 18:14:16.025085571 +0000 UTC m=+252.797560604" observedRunningTime="2026-04-17 18:14:16.6772856 +0000 UTC m=+253.449760654" watchObservedRunningTime="2026-04-17 18:14:16.679331973 +0000 UTC m=+253.451807025"
Apr 17 18:15:03.641876 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:15:03.641845 2576 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 18:17:53.195955 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.195918 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"]
Apr 17 18:17:53.196430 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.196242 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" containerName="registry"
Apr 17 18:17:53.196430 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.196255 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" containerName="registry"
Apr 17 18:17:53.196430 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.196302 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="cee48042-1c4c-4a0d-b965-f2145692570e" containerName="registry"
Apr 17 18:17:53.198905 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.198889 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.202450 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"openshift-service-ca.crt\""
Apr 17 18:17:53.202610 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202456 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"webhook-server-cert\""
Apr 17 18:17:53.202610 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202486 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"jobset-manager-config\""
Apr 17 18:17:53.202610 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"jobset-controller-manager-dockercfg-pqpcn\""
Apr 17 18:17:53.202610 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202461 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-jobset-operator\"/\"kube-root-ca.crt\""
Apr 17 18:17:53.202610 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.202459 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-jobset-operator\"/\"metrics-server-cert\""
Apr 17 18:17:53.207110 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.207089 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"]
Apr 17 18:17:53.247228 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.247165 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-metrics-certs\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.247403 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.247262 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c784087-3cef-4005-b0af-221b13da7c93-manager-config\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.247403 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.247285 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmmq\" (UniqueName: \"kubernetes.io/projected/4c784087-3cef-4005-b0af-221b13da7c93-kube-api-access-hkmmq\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.247403 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.247312 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-cert\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.348207 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.348144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-metrics-certs\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.348412 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.348234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c784087-3cef-4005-b0af-221b13da7c93-manager-config\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.348412 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.348262 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmmq\" (UniqueName: \"kubernetes.io/projected/4c784087-3cef-4005-b0af-221b13da7c93-kube-api-access-hkmmq\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.348412 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.348300 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-cert\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.349441 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.349421 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/4c784087-3cef-4005-b0af-221b13da7c93-manager-config\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.350822 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.350792 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-cert\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.350926 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.350825 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4c784087-3cef-4005-b0af-221b13da7c93-metrics-certs\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.357318 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.357274 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmmq\" (UniqueName: \"kubernetes.io/projected/4c784087-3cef-4005-b0af-221b13da7c93-kube-api-access-hkmmq\") pod \"jobset-controller-manager-744549c867-5d5km\" (UID: \"4c784087-3cef-4005-b0af-221b13da7c93\") " pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.510737 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.510636 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:53.637648 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.637613 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"]
Apr 17 18:17:53.641116 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:17:53.641086 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c784087_3cef_4005_b0af_221b13da7c93.slice/crio-1c2cb89363f389cd3f5f01fbb3c37cc9954ba3124cdfea88292efc84df071eb2 WatchSource:0}: Error finding container 1c2cb89363f389cd3f5f01fbb3c37cc9954ba3124cdfea88292efc84df071eb2: Status 404 returned error can't find the container with id 1c2cb89363f389cd3f5f01fbb3c37cc9954ba3124cdfea88292efc84df071eb2
Apr 17 18:17:53.642971 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:53.642954 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 18:17:54.232234 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:54.232198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km" event={"ID":"4c784087-3cef-4005-b0af-221b13da7c93","Type":"ContainerStarted","Data":"1c2cb89363f389cd3f5f01fbb3c37cc9954ba3124cdfea88292efc84df071eb2"}
Apr 17 18:17:57.240877 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:57.240844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km" event={"ID":"4c784087-3cef-4005-b0af-221b13da7c93","Type":"ContainerStarted","Data":"fa1b97b4ec393942e74fbc3025a92b5d4bfe3abc5e87a88b2c0b867efd1f3c96"}
Apr 17 18:17:57.241346 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:57.240957 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:17:57.258093 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:17:57.258024 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km" podStartSLOduration=1.278140518 podStartE2EDuration="4.257999766s" podCreationTimestamp="2026-04-17 18:17:53 +0000 UTC" firstStartedPulling="2026-04-17 18:17:53.64311119 +0000 UTC m=+470.415586222" lastFinishedPulling="2026-04-17 18:17:56.622970437 +0000 UTC m=+473.395445470" observedRunningTime="2026-04-17 18:17:57.2571445 +0000 UTC m=+474.029619552" watchObservedRunningTime="2026-04-17 18:17:57.257999766 +0000 UTC m=+474.030474819"
Apr 17 18:18:08.249509 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:18:08.249474 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-jobset-operator/jobset-controller-manager-744549c867-5d5km"
Apr 17 18:21:00.204559 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:00.204527 2576 ???:1] "http: TLS handshake error from 10.0.134.83:45248: EOF"
Apr 17 18:21:00.208109 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:00.208080 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-nns8d_8d7c622e-a8b2-4ec5-b59a-62c39c3285bd/global-pull-secret-syncer/0.log"
Apr 17 18:21:00.324392 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:00.324360 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rpssj_0473a77b-a212-4e59-806b-bcef01945958/konnectivity-agent/0.log"
Apr 17 18:21:00.431044 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:00.431014 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-134-83.ec2.internal_b6dc9341d6e3a6c508cdf128e57395dc/haproxy/0.log"
Apr 17 18:21:03.552085 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:03.552051 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-758sl_40f76e4a-b5c4-4066-9388-6997cc8391ba/node-exporter/0.log"
Apr 17 18:21:03.581450 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:03.581423 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-758sl_40f76e4a-b5c4-4066-9388-6997cc8391ba/kube-rbac-proxy/0.log"
Apr 17 18:21:03.606923 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:03.606895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-758sl_40f76e4a-b5c4-4066-9388-6997cc8391ba/init-textfile/0.log"
Apr 17 18:21:05.626087 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:05.626050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-rrl6j_4518d856-34d4-4abd-a245-c368bbffa021/networking-console-plugin/0.log"
Apr 17 18:21:06.589918 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.589877 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"]
Apr 17 18:21:06.593015 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.592988 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.595492 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.595460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zls8\"/\"kube-root-ca.crt\""
Apr 17 18:21:06.596107 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.596088 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-9zls8\"/\"openshift-service-ca.crt\""
Apr 17 18:21:06.596219 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.596098 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-9zls8\"/\"default-dockercfg-49jmw\""
Apr 17 18:21:06.603134 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.603095 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"]
Apr 17 18:21:06.641779 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.641734 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-podres\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.642211 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.641832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-sys\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.642211 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.641858 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-proc\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.642211 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.641886 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-lib-modules\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.642211 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.641903 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkg65\" (UniqueName: \"kubernetes.io/projected/48bf3fc6-413c-428c-8733-09440b7df730-kube-api-access-tkg65\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742441 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742398 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-podres\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-sys\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-proc\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742516 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-lib-modules\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742542 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkg65\" (UniqueName: \"kubernetes.io/projected/48bf3fc6-413c-428c-8733-09440b7df730-kube-api-access-tkg65\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742595 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-sys\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742599 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-podres\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742622 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-proc\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.742920 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.742670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48bf3fc6-413c-428c-8733-09440b7df730-lib-modules\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.751166 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.751123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkg65\" (UniqueName: \"kubernetes.io/projected/48bf3fc6-413c-428c-8733-09440b7df730-kube-api-access-tkg65\") pod \"perf-node-gather-daemonset-48r9l\" (UID: \"48bf3fc6-413c-428c-8733-09440b7df730\") " pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:06.904226 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:06.904103 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:07.032690 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.032658 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"]
Apr 17 18:21:07.036329 ip-10-0-134-83 kubenswrapper[2576]: W0417 18:21:07.036283 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod48bf3fc6_413c_428c_8733_09440b7df730.slice/crio-3c0fcdf0984fcfb411151575dfdac6b3150b9c6d29de9b2e76a765b58a1b28c5 WatchSource:0}: Error finding container 3c0fcdf0984fcfb411151575dfdac6b3150b9c6d29de9b2e76a765b58a1b28c5: Status 404 returned error can't find the container with id 3c0fcdf0984fcfb411151575dfdac6b3150b9c6d29de9b2e76a765b58a1b28c5
Apr 17 18:21:07.490619 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.490536 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7b5fn_ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e/dns/0.log"
Apr 17 18:21:07.514533 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.514508 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-7b5fn_ee9f00fa-ccd7-4b03-8dee-15be1ab71b6e/kube-rbac-proxy/0.log"
Apr 17 18:21:07.674197 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.674148 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-fdxn5_c650676f-6713-458e-b337-21b94770e9f5/dns-node-resolver/0.log"
Apr 17 18:21:07.748788 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.748693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l" event={"ID":"48bf3fc6-413c-428c-8733-09440b7df730","Type":"ContainerStarted","Data":"a5a59ba7f1fd1d3ad50a3c32c8da2a9ae5fb5642f899267931e3eb1ede339945"}
Apr 17 18:21:07.748788 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.748729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l" event={"ID":"48bf3fc6-413c-428c-8733-09440b7df730","Type":"ContainerStarted","Data":"3c0fcdf0984fcfb411151575dfdac6b3150b9c6d29de9b2e76a765b58a1b28c5"}
Apr 17 18:21:07.749002 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.748869 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:07.765551 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:07.765495 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l" podStartSLOduration=1.765480537 podStartE2EDuration="1.765480537s" podCreationTimestamp="2026-04-17 18:21:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 18:21:07.76495947 +0000 UTC m=+664.537434516" watchObservedRunningTime="2026-04-17 18:21:07.765480537 +0000 UTC m=+664.537955579"
Apr 17 18:21:08.178359 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:08.178320 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-cm9j8_d2f215a7-d8e6-4b38-bd88-ad6bf1f07470/node-ca/0.log"
Apr 17 18:21:09.229305 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:09.229272 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xmchg_057fe092-ea34-41d4-a4dc-e565cd2567bf/serve-healthcheck-canary/0.log"
Apr 17 18:21:09.731820 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:09.731788 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vsbmf_3cfcea2b-a665-48c5-b1bc-8840b437d6bc/kube-rbac-proxy/0.log"
Apr 17 18:21:09.753836 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:09.753805 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vsbmf_3cfcea2b-a665-48c5-b1bc-8840b437d6bc/exporter/0.log"
Apr 17 18:21:09.778380 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:09.778353 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-vsbmf_3cfcea2b-a665-48c5-b1bc-8840b437d6bc/extractor/0.log"
Apr 17 18:21:11.344086 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:11.344056 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-jobset-operator_jobset-controller-manager-744549c867-5d5km_4c784087-3cef-4005-b0af-221b13da7c93/manager/0.log"
Apr 17 18:21:13.761817 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:13.761787 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-9zls8/perf-node-gather-daemonset-48r9l"
Apr 17 18:21:16.104947 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.104917 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-67qdn_3728df74-67ba-42d4-88b0-a83eca2d9e0f/kube-multus/0.log"
Apr 17 18:21:16.606283 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.606214 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/kube-multus-additional-cni-plugins/0.log"
Apr 17 18:21:16.629034 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.629009 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/egress-router-binary-copy/0.log"
Apr 17 18:21:16.653835 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.653803 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/cni-plugins/0.log"
Apr 17 18:21:16.680329 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.680301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/bond-cni-plugin/0.log"
Apr 17 18:21:16.706671 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.706640 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/routeoverride-cni/0.log"
Apr 17 18:21:16.731729 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.731695 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/whereabouts-cni-bincopy/0.log"
Apr 17 18:21:16.754288 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.754254 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhcv2_8ff23a89-3da2-420c-a6f3-bf94173e14c7/whereabouts-cni/0.log"
Apr 17 18:21:16.832096 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.832064 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nmf4d_1d2787f0-024b-4888-980e-e458a856a250/network-metrics-daemon/0.log"
Apr 17 18:21:16.854982 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:16.854947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-nmf4d_1d2787f0-024b-4888-980e-e458a856a250/kube-rbac-proxy/0.log"
Apr 17 18:21:17.705987 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.705958 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/ovn-controller/0.log"
Apr 17 18:21:17.732139 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.732105 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/ovn-acl-logging/0.log"
Apr 17 18:21:17.771270 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.771241 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/kube-rbac-proxy-node/0.log"
Apr 17 18:21:17.792931 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.792899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 18:21:17.816405 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.816334 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/northd/0.log"
Apr 17 18:21:17.840518 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.840496 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/nbdb/0.log"
Apr 17 18:21:17.866435 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.866410 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/sbdb/0.log"
Apr 17 18:21:17.961783 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:17.961746 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5w6tv_0c89c182-082a-4aa5-95e4-40fa3cd0c63d/ovnkube-controller/0.log"
Apr 17 18:21:19.765291 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:19.765256 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zm674_c4c3a5fb-6090-40b4-b79d-305cd89dd057/network-check-target-container/0.log"
Apr 17 18:21:20.817615 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:20.817587 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-7bztg_a87beab3-c9c8-4166-b1ea-18aacdaa1b02/iptables-alerter/0.log"
Apr 17 18:21:21.511853 ip-10-0-134-83 kubenswrapper[2576]: I0417 18:21:21.511815 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-v86sj_7949a229-9d6d-445d-938c-4147dc073aaf/tuned/0.log"