Apr 24 19:06:37.244272 ip-10-0-138-142 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:06:37.720301 ip-10-0-138-142 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:37.720301 ip-10-0-138-142 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:06:37.720301 ip-10-0-138-142 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:37.720301 ip-10-0-138-142 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:06:37.720301 ip-10-0-138-142 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
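The deprecation warnings above say these command-line flags should move into the file passed via the kubelet's --config flag. A minimal, hypothetical KubeletConfiguration sketch of that migration (field names are the real v1beta1 equivalents; the values shown are illustrative, not taken from this node):

```yaml
# Illustrative fragment of a kubelet config file (e.g. /etc/kubernetes/kubelet.conf).
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (example reservations)
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction settings
evictionHard:
  memory.available: 100Mi
```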
Apr 24 19:06:37.721649 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.720993 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:06:37.725092 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725067 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.725092 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725086 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.725092 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725090 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.725092 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725096 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.725092 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725100 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725104 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725108 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725111 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725118 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725123 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725126 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725130 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725133 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725137 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725141 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725144 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725149 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725159 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725163 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725168 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725171 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725175 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725180 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725184 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.725398 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725187 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725192 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725196 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725199 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725205 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725212 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725216 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725220 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725225 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725229 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725233 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725238 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725243 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725248 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725253 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725258 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725263 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725267 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725273 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.726218 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725277 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725281 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725285 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725289 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725293 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725297 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725302 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725306 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725310 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725314 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725318 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725322 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725327 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725331 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725335 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725339 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725343 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725347 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725351 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725355 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.726866 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725359 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725365 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725370 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725374 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725378 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725382 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725386 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725392 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725397 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725400 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725405 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725409 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725414 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725418 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725422 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725426 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725430 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725434 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725439 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725443 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.727486 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725447 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725451 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.725455 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726123 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726133 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726137 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726141 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726145 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726150 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726155 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726160 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726164 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726168 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726172 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726176 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726181 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726186 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726190 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726194 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726198 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.728353 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726203 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726207 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726211 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726215 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726220 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726224 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726228 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726232 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726236 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726240 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726244 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726248 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726253 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726257 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726261 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726266 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726270 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726274 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726279 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726283 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.729137 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726291 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726296 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726300 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726305 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726309 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726313 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726318 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726322 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726329 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726334 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726339 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726344 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726348 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726352 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726356 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726362 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726368 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726373 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.729787 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726378 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726382 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726386 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726392 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726396 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726400 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726404 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726408 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726412 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726416 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726420 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726424 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726428 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726433 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726438 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726442 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726446 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726450 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726454 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726459 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.730369 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726463 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726467 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726471 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726476 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726480 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726484 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726487 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726491 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726496 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726500 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.726504 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727244 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727261 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727271 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727278 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727285 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727290 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727298 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727306 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727311 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727316 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:06:37.730862 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727322 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727327 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727332 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727337 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727349 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727354 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727358 2565 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727363 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727369 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727412 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727418 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727424 2565 flags.go:64] FLAG: --config-dir=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727429 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727435 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727445 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727450 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727456 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727462 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727467 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727472 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727476 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727482 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727486 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727493 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727498 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:06:37.731400 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727503 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727507 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727512 2565 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727517 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727524 2565 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727530 2565 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727535 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727540 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727545 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727551 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727555 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727561 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727567 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727571 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727576 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727581 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727587 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727591 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727596 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727601 2565 flags.go:64] FLAG: --feature-gates=""
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727607 2565 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727612 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727617 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727623 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727628 2565 flags.go:64] FLAG: --healthz-port="10248"
Apr 24 19:06:37.732146 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727632 2565 flags.go:64] FLAG: --help="false"
Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727637 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-138-142.ec2.internal"
Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727643 2565 flags.go:64] FLAG:
--housekeeping-interval="10s" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727648 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727653 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727658 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727664 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727670 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727674 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727679 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727684 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727689 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727694 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727699 2565 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727704 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727708 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:06:37.732744 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:06:37.727713 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727718 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727723 2565 flags.go:64] FLAG: --lock-file="" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727728 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727734 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727739 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727748 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:06:37.732744 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727753 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727758 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727762 2565 flags.go:64] FLAG: --logging-format="text" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727768 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727775 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727779 2565 flags.go:64] FLAG: --manifest-url="" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727784 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727792 2565 flags.go:64] FLAG: 
--max-housekeeping-interval="15s" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727796 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727804 2565 flags.go:64] FLAG: --max-pods="110" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727809 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727814 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727818 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727823 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727829 2565 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727834 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727838 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727852 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727857 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727862 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727867 2565 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727872 2565 flags.go:64] FLAG: 
--pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727882 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727887 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:06:37.733345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727892 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727897 2565 flags.go:64] FLAG: --port="10250" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727902 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727907 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-03875064aeeb72fc6" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727913 2565 flags.go:64] FLAG: --qos-reserved="" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727918 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727923 2565 flags.go:64] FLAG: --register-node="true" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727928 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727934 2565 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727940 2565 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727945 2565 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727949 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 24 
19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727954 2565 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727960 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727965 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727985 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727991 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.727996 2565 flags.go:64] FLAG: --runonce="false" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728001 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728006 2565 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728011 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728016 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728020 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728026 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728031 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728036 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:06:37.733952 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:06:37.728042 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728047 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728052 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728056 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728061 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728066 2565 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728071 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728081 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728085 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728100 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728108 2565 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728112 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728117 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728122 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728127 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 
19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728132 2565 flags.go:64] FLAG: --v="2" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728139 2565 flags.go:64] FLAG: --version="false" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728146 2565 flags.go:64] FLAG: --vmodule="" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728153 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728159 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728342 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728349 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728354 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728358 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:06:37.734602 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728364 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728369 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728374 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728378 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 
19:06:37.728382 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728387 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728391 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728395 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728400 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728412 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728417 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728421 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728425 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728429 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728434 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728438 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728443 2565 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728449 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728455 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:37.735292 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728459 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728463 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728467 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728471 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728475 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728480 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728484 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728488 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728492 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728496 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: 
W0424 19:06:37.728501 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728505 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728512 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728538 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728543 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728548 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728552 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728557 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728561 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:37.735790 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728568 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728573 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728577 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728582 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728594 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728599 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728603 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728608 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728612 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728616 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728620 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728629 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728635 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728639 2565 
feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728643 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728648 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728652 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728656 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728660 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728664 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 19:06:37.736658 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728668 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728673 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728677 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728682 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728686 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728690 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 
19:06:37.728694 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728699 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728703 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728707 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728711 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728715 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728719 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728723 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728727 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728732 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728736 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728748 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728752 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:37.737352 ip-10-0-138-142 
kubenswrapper[2565]: W0424 19:06:37.728757 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.737352 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728761 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728765 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728770 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.728776 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.728785 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.737640 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.737658 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737712 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737718 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737722 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737727 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737732 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737735 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737738 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737741 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.737851 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737744 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737746 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737749 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737752 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737755 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737758 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737760 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737763 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737766 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737768 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737771 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737774 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737777 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737780 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737783 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737785 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737788 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737790 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737794 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737798 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.738265 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737801 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737804 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737806 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737810 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737812 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737815 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737818 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737821 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737823 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737826 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737829 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737832 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737834 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737837 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737839 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737842 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737845 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737848 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737850 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737853 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.738758 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737856 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737859 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737862 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737865 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737867 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737870 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737872 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737875 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737878 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737880 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737883 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737886 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737889 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737891 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737894 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737897 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737901 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737903 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737906 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737909 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.739313 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737911 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737914 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737917 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737920 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737922 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737924 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737927 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737929 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737933 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737935 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737938 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737941 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737944 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737946 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737949 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737952 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737954 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.739796 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.737956 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.737962 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738079 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738085 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738089 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738092 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738095 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738099 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738102 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738105 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738107 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738110 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738112 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738115 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738118 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:37.740244 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738120 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738123 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738125 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738128 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738130 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738133 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738135 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738138 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738140 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738144 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738146 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738149 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738151 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738154 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738157 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738159 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738162 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738165 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738167 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:37.740614 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738170 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738172 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738175 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738177 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738180 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738183 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738186 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738188 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738191 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738193 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738196 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738199 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738201 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738204 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738206 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738209 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738211 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738214 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738216 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738219 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:37.741106 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738221 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738224 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738228 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738233 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738236 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738239 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738242 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738244 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738247 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738249 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738252 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738254 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738257 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738259 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738262 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738265 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738267 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738270 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738273 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:37.741589 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738275 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738279 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738282 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738285 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738287 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738290 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738292 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738295 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738298 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738300 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738303 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738305 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738308 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738310 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:37.738313 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.738317 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:37.742074 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.739521 2565 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 19:06:37.742513 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.741983 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 19:06:37.742967 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.742955 2565 server.go:1019] "Starting client certificate rotation"
Apr 24 19:06:37.743093 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.743072 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:37.743132 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.743123 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:37.767324 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.767300 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:37.770157 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.770134 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:37.785241 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.785219 2565 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:06:37.790339 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.790321 2565 log.go:25] "Validated CRI v1 image API"
Apr 24 19:06:37.791636 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.791613 2565 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:06:37.794742 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.794692 2565 fs.go:135] Filesystem UUIDs: map[09d76f15-1606-4ee6-a06a-8c3f0b4d4b94:/dev/nvme0n1p3 58e75924-b958-4e37-9397-b323b580277b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 24 19:06:37.794839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.794733 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:06:37.799658 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.799639 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:06:37.801745 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.801611 2565 manager.go:217] Machine: {Timestamp:2026-04-24 19:06:37.799579396 +0000 UTC m=+0.437587390 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102569 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27371dc82d103a83814c2f3242e27a SystemUUID:ec27371d-c82d-103a-8381-4c2f3242e27a BootID:716b45a5-9e9f-4bb6-9bc8-6896c6503f8b Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:f2:bf:52:ac:d3 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:f2:bf:52:ac:d3 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ea:76:e5:01:93:33 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:06:37.802829 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.802817 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:06:37.802937 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.802919 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:06:37.804665 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.804638 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:06:37.804819 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.804668 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-142.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 19:06:37.804866 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.804830 2565 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 19:06:37.804866 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.804839 2565 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 19:06:37.804866 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.804852 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:06:37.805592 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.805581 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 24 19:06:37.806365 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.806354 2565 state_mem.go:36] "Initialized new in-memory state store"
Apr 24 19:06:37.806468 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.806459 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 24 19:06:37.809023 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.809011 2565 kubelet.go:491] "Attempting to sync node with API server"
Apr 24 19:06:37.809063 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.809027 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 24 19:06:37.809063 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.809039 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 24 19:06:37.809063 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.809049 2565 kubelet.go:397] "Adding apiserver pod source"
Apr 24 19:06:37.809063 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.809061 2565 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 24 19:06:37.810978 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.810953 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:06:37.811029 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.810996 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 24 19:06:37.814202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.814185 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 24 19:06:37.815488 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.815465 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 24 19:06:37.817241 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817227 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817246 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817252 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817257 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817264 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 24
19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817270 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817276 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817282 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817290 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817296 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 19:06:37.817307 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817304 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 19:06:37.817564 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.817312 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:06:37.818268 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.818247 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:06:37.818268 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.818258 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:06:37.820182 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.820133 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 19:06:37.820261 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.820132 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: 
services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 19:06:37.822002 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.821989 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:06:37.822041 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.822027 2565 server.go:1295] "Started kubelet" Apr 24 19:06:37.822694 ip-10-0-138-142 systemd[1]: Started Kubernetes Kubelet. Apr 24 19:06:37.822945 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.822914 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:06:37.823279 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.822897 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:06:37.823279 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.823108 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:06:37.824623 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.824603 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:06:37.825942 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.825928 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:06:37.828779 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.828760 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fz4pw" Apr 24 19:06:37.829827 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.829808 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:06:37.829827 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.829824 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:37.831485 
ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.831465 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:06:37.831485 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.831489 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:06:37.831669 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.831654 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:06:37.831669 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.831668 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:06:37.832416 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.832394 2565 factory.go:55] Registering systemd factory Apr 24 19:06:37.832492 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.832469 2565 factory.go:223] Registration of the systemd container factory successfully Apr 24 19:06:37.833008 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.832995 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 19:06:37.833096 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833075 2565 factory.go:153] Registering CRI-O factory Apr 24 19:06:37.833303 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833286 2565 factory.go:223] Registration of the crio container factory successfully Apr 24 19:06:37.833390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833364 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:06:37.833390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833387 2565 factory.go:103] Registering Raw factory Apr 24 19:06:37.833492 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833410 2565 manager.go:1196] Started watching for new ooms in manager Apr 24 19:06:37.833577 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.833260 2565 
csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-142.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 19:06:37.834249 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.834232 2565 manager.go:319] Starting recovery of all containers Apr 24 19:06:37.834342 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.833258 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-142.ec2.internal.18a960784f96e553 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-142.ec2.internal,UID:ip-10-0-138-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-142.ec2.internal,},FirstTimestamp:2026-04-24 19:06:37.822002515 +0000 UTC m=+0.460010507,LastTimestamp:2026-04-24 19:06:37.822002515 +0000 UTC m=+0.460010507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-142.ec2.internal,}" Apr 24 19:06:37.835151 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.835134 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:37.836730 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.836687 2565 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 19:06:37.841496 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.841471 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fz4pw" Apr 24 19:06:37.841949 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.841911 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 19:06:37.842097 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.842076 2565 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-142.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 19:06:37.847322 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.847188 2565 manager.go:324] Recovery completed Apr 24 19:06:37.851449 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.851437 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:37.856144 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856123 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:37.856224 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856162 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:37.856224 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856178 2565 kubelet_node_status.go:736] "Recording event message 
for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:37.856743 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856727 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:06:37.856743 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856741 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:06:37.856844 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.856761 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:37.858119 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.858043 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-142.ec2.internal.18a96078519fd99e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-142.ec2.internal,UID:ip-10-0-138-142.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-142.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-142.ec2.internal,},FirstTimestamp:2026-04-24 19:06:37.856143774 +0000 UTC m=+0.494151766,LastTimestamp:2026-04-24 19:06:37.856143774 +0000 UTC m=+0.494151766,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-142.ec2.internal,}" Apr 24 19:06:37.859074 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.859061 2565 policy_none.go:49] "None policy: Start" Apr 24 19:06:37.859129 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.859097 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:06:37.859129 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.859108 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 24 
19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.896645 2565 manager.go:341] "Starting Device Plugin manager" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.896683 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.896695 2565 server.go:85] "Starting device plugin registration server" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.896949 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.896959 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.897110 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.897206 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.897216 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.897739 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:06:37.911246 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.897785 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:37.963542 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.963498 2565 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 24 19:06:37.964690 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.964667 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:06:37.964690 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.964693 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:06:37.964853 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.964715 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 19:06:37.964853 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.964723 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:06:37.964853 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:37.964763 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:06:37.969019 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.968997 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:37.997513 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:37.997426 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:38.003068 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.003049 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:38.003148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.003080 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:38.003148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.003090 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:38.003148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.003112 2565 
kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.011811 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.011791 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.011909 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.011814 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-142.ec2.internal\": node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.032760 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.032737 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.065134 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.065103 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal"] Apr 24 19:06:38.065288 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.065183 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:38.066762 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.066745 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:38.066848 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.066781 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:38.066848 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.066795 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:38.069042 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069024 2565 kubelet_node_status.go:413] "Setting 
node annotation to enable volume controller attach/detach" Apr 24 19:06:38.069182 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069164 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.069231 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069200 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:38.069783 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069763 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:38.069783 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069772 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:38.069908 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069796 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:38.069908 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069795 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:38.069908 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069836 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:38.069908 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.069812 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:38.072162 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.072145 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.072251 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.072179 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:38.073384 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.073367 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:38.073455 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.073393 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:38.073455 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.073403 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:38.094799 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.094776 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-142.ec2.internal\" not found" node="ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.099391 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.099366 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-142.ec2.internal\" not found" node="ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.133789 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.133746 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.234535 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.234503 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.234693 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.234577 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.234693 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.234604 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.234693 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.234624 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.335293 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.335221 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.335293 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 
19:06:38.335432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335309 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.335432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335327 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.335432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335368 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.335432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335371 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/1427eed4f2cd472349fa20b6f1cf215c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal\" (UID: \"1427eed4f2cd472349fa20b6f1cf215c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.335432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.335398 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/027943e939a2d76cdb600f777d89968b-config\") pod \"kube-apiserver-proxy-ip-10-0-138-142.ec2.internal\" (UID: \"027943e939a2d76cdb600f777d89968b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.396406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.396372 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.401832 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.401813 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:38.436274 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.436242 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.536876 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.536834 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.637443 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.637354 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.738037 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.737983 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.743369 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.743341 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:06:38.743505 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.743487 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very 
short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:06:38.829958 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.829922 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:38.838409 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.838380 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.843489 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.843459 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:01:37 +0000 UTC" deadline="2028-01-30 23:05:19.513241978 +0000 UTC" Apr 24 19:06:38.843565 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.843491 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15507h58m40.669754987s" Apr 24 19:06:38.863682 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.863653 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 19:06:38.888601 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.888509 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-szrps" Apr 24 19:06:38.897623 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.897584 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-szrps" Apr 24 19:06:38.939439 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:38.939401 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-142.ec2.internal\" not found" Apr 24 19:06:38.970911 
ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.970890 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:38.976596 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:38.976562 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1427eed4f2cd472349fa20b6f1cf215c.slice/crio-858cb3d586b69fa48540405c54901cd9d9d0fdd6cfb54fbb38711b122584205f WatchSource:0}: Error finding container 858cb3d586b69fa48540405c54901cd9d9d0fdd6cfb54fbb38711b122584205f: Status 404 returned error can't find the container with id 858cb3d586b69fa48540405c54901cd9d9d0fdd6cfb54fbb38711b122584205f Apr 24 19:06:38.977368 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:38.977345 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod027943e939a2d76cdb600f777d89968b.slice/crio-61af71e23f7b87be9f71f0d2f0c45a4cd6c876e255fb5e278c656f8ca930b26c WatchSource:0}: Error finding container 61af71e23f7b87be9f71f0d2f0c45a4cd6c876e255fb5e278c656f8ca930b26c: Status 404 returned error can't find the container with id 61af71e23f7b87be9f71f0d2f0c45a4cd6c876e255fb5e278c656f8ca930b26c Apr 24 19:06:38.980763 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.980744 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:06:38.996700 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:38.996674 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:39.031917 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.031882 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" Apr 24 19:06:39.045150 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.045129 2565 warnings.go:110] "Warning: metadata.name: 
this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:39.046025 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.046012 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" Apr 24 19:06:39.059978 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.059957 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:39.192284 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.192259 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:39.725430 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.725399 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:39.810131 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.810095 2565 apiserver.go:52] "Watching apiserver" Apr 24 19:06:39.817926 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.817900 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:06:39.818314 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.818289 2565 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-7kd45","openshift-multus/multus-additional-cni-plugins-6zxkd","openshift-multus/multus-z9xw6","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal","openshift-multus/network-metrics-daemon-dxx98","openshift-network-diagnostics/network-check-target-7kd4p","openshift-network-operator/iptables-alerter-f2bnb","openshift-ovn-kubernetes/ovnkube-node-5frl6","kube-system/konnectivity-agent-8wdng","kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv","openshift-cluster-node-tuning-operator/tuned-4k6xp"] Apr 24 19:06:39.820798 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.820772 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:39.823417 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.823396 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.823417 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.823407 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-q9lm4\"" Apr 24 19:06:39.823577 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.823479 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.823640 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.823562 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:06:39.826929 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.826903 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.829328 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829305 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-2kjfz\"" Apr 24 19:06:39.829407 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829388 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:39.829477 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.829452 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:39.829624 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829608 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.829748 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829717 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.829837 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829770 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 19:06:39.829837 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.829814 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:06:39.831618 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.831599 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.831811 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.831791 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:39.831915 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.831858 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:39.833964 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.833801 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:06:39.833964 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.833945 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.834190 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.834171 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:06:39.834510 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.834495 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-kpm9h\"" Apr 24 19:06:39.836391 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.836237 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.836391 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.836308 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:06:39.837050 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.836685 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.837050 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.836812 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pvkkf\"" Apr 24 19:06:39.837050 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.836816 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:39.838987 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.838953 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-62ck5\"" Apr 24 19:06:39.839247 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.839225 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:39.839356 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.839262 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:06:39.839623 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.839605 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:06:39.840591 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.840405 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:06:39.840591 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.840437 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.840591 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.840438 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.840591 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.840492 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:06:39.841802 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.841625 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lf2vs\"" Apr 24 19:06:39.841802 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.841679 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:06:39.842027 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.841999 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:39.842621 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.842602 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:06:39.844236 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-cnibin\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.844408 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844388 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-nzf4q\"" Apr 24 19:06:39.844504 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844405 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:39.844504 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844447 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.844654 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844638 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.844722 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844362 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:06:39.844722 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844714 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-os-release\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.844826 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844749 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.844826 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844779 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq7cv\" (UniqueName: 
\"kubernetes.io/projected/5998df72-185d-4c96-bec0-af611d2d2173-kube-api-access-xq7cv\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.844826 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844807 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-os-release\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.844966 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844856 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.844966 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844886 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-socket-dir-parent\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.844966 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844911 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-bin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.844966 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:06:39.844932 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-conf-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.844965 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-multus-daemon-config\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-hostroot\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845056 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp82b\" (UniqueName: \"kubernetes.io/projected/5bd705be-109f-4e28-aaaf-136053b1e384-kube-api-access-jp82b\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845096 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/917aae83-6ba1-4789-900b-362379b7e091-iptables-alerter-script\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " 
pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845119 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-serviceca\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-cni-binary-copy\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845180 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845169 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-multus\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845198 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845227 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: 
\"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845249 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-host\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845272 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-kubelet\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845295 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/917aae83-6ba1-4789-900b-362379b7e091-host-slash\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845317 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845340 
2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845363 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845417 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-multus-certs\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-etc-kubernetes\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845507 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-system-cni-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " 
pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845556 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-system-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845595 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-cnibin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845621 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-k8s-cni-cncf-io\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xns\" (UniqueName: \"kubernetes.io/projected/d318f255-7c0b-4460-b175-7e79533a85e2-kube-api-access-k9xns\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845667 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-netns\") pod 
\"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt8bd\" (UniqueName: \"kubernetes.io/projected/917aae83-6ba1-4789-900b-362379b7e091-kube-api-access-qt8bd\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.845882 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.845716 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6mf\" (UniqueName: \"kubernetes.io/projected/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-kube-api-access-ws6mf\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:39.846987 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.846952 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:39.847246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.847102 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:39.847246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.847229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-clc5q\"" Apr 24 19:06:39.899198 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.899141 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:38 +0000 UTC" deadline="2027-12-08 01:44:15.562795585 +0000 UTC" Apr 24 19:06:39.899198 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:06:39.899193 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14214h37m35.663606342s"
Apr 24 19:06:39.934799 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.934770 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 19:06:39.946195 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/917aae83-6ba1-4789-900b-362379b7e091-host-slash\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb"
Apr 24 19:06:39.946195 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946199 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.946434 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946257 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/917aae83-6ba1-4789-900b-362379b7e091-host-slash\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb"
Apr 24 19:06:39.946434 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946310 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946434 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946340 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-multus-certs\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946434 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-etc-kubernetes\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946434 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-bin\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946446 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f92f49a6-73c3-429f-83d7-d170a250611b-agent-certs\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946470 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946477 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-multus-certs\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946484 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946499 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-system-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946541 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-k8s-cni-cncf-io\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946554 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-etc-kubernetes\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946607 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-system-cni-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.946641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946612 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xns\" (UniqueName: \"kubernetes.io/projected/d318f255-7c0b-4460-b175-7e79533a85e2-kube-api-access-k9xns\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946662 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-k8s-cni-cncf-io\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946800 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-netd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946804 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946838 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-config\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946866 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysconfig\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6mf\" (UniqueName: \"kubernetes.io/projected/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-kube-api-access-ws6mf\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946940 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-socket-dir-parent\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.946967 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-conf-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947006 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947009 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-ovn\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-socket-dir-parent\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947034 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-node-log\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947062 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-multus-conf-dir\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947084 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947120 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw696\" (UniqueName: \"kubernetes.io/projected/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-kube-api-access-vw696\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947156 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-tuned\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947186 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-bin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947238 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-systemd-units\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947265 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947290 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-device-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947292 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-bin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947315 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-hostroot\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947340 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-sys\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947362 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-hostroot\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947389 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.947410 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947397 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-log-socket\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947451 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-etc-selinux\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q88kb\" (UniqueName: \"kubernetes.io/projected/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kube-api-access-q88kb\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947517 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-modprobe-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947545 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-serviceca\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947624 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-cni-binary-copy\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947953 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-serviceca\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.947955 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-kubelet\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948005 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-host\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948056 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948082 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948114 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-host\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-kubelet\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948164 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-slash\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.948184 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-registration-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948221 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-host\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45"
Apr 24 19:06:39.948292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948237 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-kubernetes\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.948272 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:40.448233753 +0000 UTC m=+3.086241734 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948187 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-cni-binary-copy\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948297 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-kubelet\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948427 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948467 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948510 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovn-node-metrics-cert\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948535 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f92f49a6-73c3-429f-83d7-d170a250611b-konnectivity-ca\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948565 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-conf\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948620 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-var-lib-kubelet\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-system-cni-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948692 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-cnibin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-etc-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948753 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-netns\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948780 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qt8bd\" (UniqueName: \"kubernetes.io/projected/917aae83-6ba1-4789-900b-362379b7e091-kube-api-access-qt8bd\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948811 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-system-cni-dir\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948820 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-cnibin\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948860 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-cnibin\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948878 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-run-netns\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948893 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-os-release\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948935 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948953 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-cni-binary-copy\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948963 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xq7cv\" (UniqueName: \"kubernetes.io/projected/5998df72-185d-4c96-bec0-af611d2d2173-kube-api-access-xq7cv\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.948957 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-cnibin\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949005 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-os-release\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949029 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-multus-daemon-config\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949045 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5998df72-185d-4c96-bec0-af611d2d2173-os-release\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949055 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-netns\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949089 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-os-release\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-systemd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949193 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-socket-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949252 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-run\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-tmp\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.949790 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949352 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-env-overrides\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949402 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp82b\" (UniqueName: \"kubernetes.io/projected/5bd705be-109f-4e28-aaaf-136053b1e384-kube-api-access-jp82b\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949440 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-systemd\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949459 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5998df72-185d-4c96-bec0-af611d2d2173-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/917aae83-6ba1-4789-900b-362379b7e091-iptables-alerter-script\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-multus\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949535 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-var-lib-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949559 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d318f255-7c0b-4460-b175-7e79533a85e2-multus-daemon-config\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949560 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp"
Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949615 2565 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wc8w\" (UniqueName: \"kubernetes.io/projected/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-kube-api-access-4wc8w\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949647 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d318f255-7c0b-4460-b175-7e79533a85e2-host-var-lib-cni-multus\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949681 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-script-lib\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949704 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-sys-fs\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949731 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-lib-modules\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 
24 19:06:39.950451 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.949881 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/917aae83-6ba1-4789-900b-362379b7e091-iptables-alerter-script\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.959857 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.959826 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 19:06:39.961864 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.961837 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:39.961864 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.961868 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:39.962069 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.961884 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:39.962069 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:39.962024 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:40.462003589 +0000 UTC m=+3.100011571 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:39.963914 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.963822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6mf\" (UniqueName: \"kubernetes.io/projected/281b0be6-1dcd-4e12-a8a1-7617ae9111e8-kube-api-access-ws6mf\") pod \"node-ca-7kd45\" (UID: \"281b0be6-1dcd-4e12-a8a1-7617ae9111e8\") " pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:39.963914 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.963822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt8bd\" (UniqueName: \"kubernetes.io/projected/917aae83-6ba1-4789-900b-362379b7e091-kube-api-access-qt8bd\") pod \"iptables-alerter-f2bnb\" (UID: \"917aae83-6ba1-4789-900b-362379b7e091\") " pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:39.963914 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.963872 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq7cv\" (UniqueName: \"kubernetes.io/projected/5998df72-185d-4c96-bec0-af611d2d2173-kube-api-access-xq7cv\") pod \"multus-additional-cni-plugins-6zxkd\" (UID: \"5998df72-185d-4c96-bec0-af611d2d2173\") " pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:39.964114 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.963949 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp82b\" (UniqueName: 
\"kubernetes.io/projected/5bd705be-109f-4e28-aaaf-136053b1e384-kube-api-access-jp82b\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:39.964356 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.964339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xns\" (UniqueName: \"kubernetes.io/projected/d318f255-7c0b-4460-b175-7e79533a85e2-kube-api-access-k9xns\") pod \"multus-z9xw6\" (UID: \"d318f255-7c0b-4460-b175-7e79533a85e2\") " pod="openshift-multus/multus-z9xw6" Apr 24 19:06:39.974735 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.974679 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerStarted","Data":"858cb3d586b69fa48540405c54901cd9d9d0fdd6cfb54fbb38711b122584205f"} Apr 24 19:06:39.975609 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:39.975544 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" event={"ID":"027943e939a2d76cdb600f777d89968b","Type":"ContainerStarted","Data":"61af71e23f7b87be9f71f0d2f0c45a4cd6c876e255fb5e278c656f8ca930b26c"} Apr 24 19:06:40.050723 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050681 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-etc-selinux\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.050723 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050735 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q88kb\" (UniqueName: 
\"kubernetes.io/projected/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kube-api-access-q88kb\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-modprobe-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050780 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-kubelet\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050805 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-host\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050880 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-etc-selinux\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050910 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-modprobe-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050887 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-host\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.050955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.050916 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-kubelet\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051007 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-slash\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051035 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-registration-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-kubernetes\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051091 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051117 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovn-node-metrics-cert\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f92f49a6-73c3-429f-83d7-d170a250611b-konnectivity-ca\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051170 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-kubernetes\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051180 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-conf\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051118 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-slash\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051218 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-var-lib-kubelet\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051228 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-registration-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.051287 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051248 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-etc-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051287 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:06:40.051280 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-netns\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051303 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-systemd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051336 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-socket-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051346 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-conf\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-run\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:06:40.051381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-tmp\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051405 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-env-overrides\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051433 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-systemd\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-var-lib-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051485 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:06:40.051512 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wc8w\" (UniqueName: \"kubernetes.io/projected/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-kube-api-access-4wc8w\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-script-lib\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051182 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051561 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-sys-fs\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051587 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-lib-modules\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051617 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-bin\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051627 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-var-lib-kubelet\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.051831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051636 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-socket-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-run\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051694 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-netns\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051727 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-systemd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-etc-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051856 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-sys-fs\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052022 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-lib-modules\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052101 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-var-lib-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052157 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-systemd\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.051644 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f92f49a6-73c3-429f-83d7-d170a250611b-agent-certs\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052205 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052216 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-env-overrides\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052236 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-netd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052270 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysctl-d\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052277 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-netd\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052284 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kubelet-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/f92f49a6-73c3-429f-83d7-d170a250611b-konnectivity-ca\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052320 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-config\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.052585 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052322 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-cni-bin\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052384 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysconfig\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-ovn\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052450 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-node-log\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052461 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-sysconfig\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052478 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-ovn\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052504 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-node-log\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw696\" (UniqueName: \"kubernetes.io/projected/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-kube-api-access-vw696\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052509 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-script-lib\") pod \"ovnkube-node-5frl6\" (UID: 
\"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052537 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-tuned\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052540 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-host-run-ovn-kubernetes\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-systemd-units\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052594 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052620 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-device-dir\") pod 
\"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052662 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-sys\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052691 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-log-socket\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.053446 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052664 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-device-dir\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-run-openvswitch\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-systemd-units\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052758 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-log-socket\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052762 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-sys\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.052766 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovnkube-config\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.054258 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.054247 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-tmp\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.054583 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.054276 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-ovn-node-metrics-cert\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.054583 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.054438 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/f92f49a6-73c3-429f-83d7-d170a250611b-agent-certs\") pod \"konnectivity-agent-8wdng\" (UID: \"f92f49a6-73c3-429f-83d7-d170a250611b\") " pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:40.054720 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.054699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-etc-tuned\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.061685 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.061656 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wc8w\" (UniqueName: \"kubernetes.io/projected/82b67635-7eb8-44b6-9ea2-8d4e7edf454a-kube-api-access-4wc8w\") pod \"tuned-4k6xp\" (UID: \"82b67635-7eb8-44b6-9ea2-8d4e7edf454a\") " pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.062431 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.062413 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q88kb\" (UniqueName: \"kubernetes.io/projected/226dc5e6-2421-4eb0-8a0b-d99de2ce9568-kube-api-access-q88kb\") pod \"aws-ebs-csi-driver-node-dqtqv\" (UID: \"226dc5e6-2421-4eb0-8a0b-d99de2ce9568\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.062529 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.062481 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vw696\" (UniqueName: \"kubernetes.io/projected/ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6-kube-api-access-vw696\") pod \"ovnkube-node-5frl6\" (UID: \"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6\") " pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.132481 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.132437 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7kd45" Apr 24 19:06:40.139447 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.139424 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z9xw6" Apr 24 19:06:40.149177 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.149148 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" Apr 24 19:06:40.157034 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.157007 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-f2bnb" Apr 24 19:06:40.163713 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.163687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" Apr 24 19:06:40.170378 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.170355 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-8wdng" Apr 24 19:06:40.176053 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.176031 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" Apr 24 19:06:40.181921 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.181900 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" Apr 24 19:06:40.456266 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.456223 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:40.456447 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.456350 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:40.456447 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.456430 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:41.456410698 +0000 UTC m=+4.094418682 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:40.557209 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.557177 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:40.557350 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.557333 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:40.557391 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.557353 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:40.557391 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.557364 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:40.557456 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:40.557426 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:41.55741262 +0000 UTC m=+4.195420599 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:40.569478 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.569449 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac65a5f6_8127_4bb6_a3be_c5ed95f56ae6.slice/crio-6f64bead56e47731f75cfee136bd585d06a484f63457282806ba2414fe2fa626 WatchSource:0}: Error finding container 6f64bead56e47731f75cfee136bd585d06a484f63457282806ba2414fe2fa626: Status 404 returned error can't find the container with id 6f64bead56e47731f75cfee136bd585d06a484f63457282806ba2414fe2fa626 Apr 24 19:06:40.570631 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.570588 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf92f49a6_73c3_429f_83d7_d170a250611b.slice/crio-9f1c4bedaeb55f8f05f7abc3885ac7ff34a30ba2623d88e11ad26c93c670b12a WatchSource:0}: Error finding container 9f1c4bedaeb55f8f05f7abc3885ac7ff34a30ba2623d88e11ad26c93c670b12a: Status 404 returned error can't find the container with id 9f1c4bedaeb55f8f05f7abc3885ac7ff34a30ba2623d88e11ad26c93c670b12a Apr 24 19:06:40.571374 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.571347 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5998df72_185d_4c96_bec0_af611d2d2173.slice/crio-1147d09e7c521e36fc89111d1332433a16277966710b5cf4f063445750a03c07 WatchSource:0}: Error finding container 
1147d09e7c521e36fc89111d1332433a16277966710b5cf4f063445750a03c07: Status 404 returned error can't find the container with id 1147d09e7c521e36fc89111d1332433a16277966710b5cf4f063445750a03c07 Apr 24 19:06:40.574348 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.574314 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd318f255_7c0b_4460_b175_7e79533a85e2.slice/crio-ddc94c7bdd0a517f0011371ab061279389a4ad43371f23d526bdfc2b08c234e7 WatchSource:0}: Error finding container ddc94c7bdd0a517f0011371ab061279389a4ad43371f23d526bdfc2b08c234e7: Status 404 returned error can't find the container with id ddc94c7bdd0a517f0011371ab061279389a4ad43371f23d526bdfc2b08c234e7 Apr 24 19:06:40.575501 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.575474 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82b67635_7eb8_44b6_9ea2_8d4e7edf454a.slice/crio-db7d0882124101f5d4c3467a129f33c40aa361541606e477d5b5228d7e2dc9f4 WatchSource:0}: Error finding container db7d0882124101f5d4c3467a129f33c40aa361541606e477d5b5228d7e2dc9f4: Status 404 returned error can't find the container with id db7d0882124101f5d4c3467a129f33c40aa361541606e477d5b5228d7e2dc9f4 Apr 24 19:06:40.576312 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.576287 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281b0be6_1dcd_4e12_a8a1_7617ae9111e8.slice/crio-8e349392a8fb16cfc85a7d852c6a87420ea85c1e4fee44d52e4c560950eb5c07 WatchSource:0}: Error finding container 8e349392a8fb16cfc85a7d852c6a87420ea85c1e4fee44d52e4c560950eb5c07: Status 404 returned error can't find the container with id 8e349392a8fb16cfc85a7d852c6a87420ea85c1e4fee44d52e4c560950eb5c07 Apr 24 19:06:40.577215 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.577127 2565 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod917aae83_6ba1_4789_900b_362379b7e091.slice/crio-f28dd77381178e38d14bd8385023671c4b583020598a799eb43b1c7f1d6736dc WatchSource:0}: Error finding container f28dd77381178e38d14bd8385023671c4b583020598a799eb43b1c7f1d6736dc: Status 404 returned error can't find the container with id f28dd77381178e38d14bd8385023671c4b583020598a799eb43b1c7f1d6736dc Apr 24 19:06:40.578633 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:06:40.578316 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod226dc5e6_2421_4eb0_8a0b_d99de2ce9568.slice/crio-a0279032231f7b1334a72a7db31b19f84f43e17ae571690a719b7a94a523647a WatchSource:0}: Error finding container a0279032231f7b1334a72a7db31b19f84f43e17ae571690a719b7a94a523647a: Status 404 returned error can't find the container with id a0279032231f7b1334a72a7db31b19f84f43e17ae571690a719b7a94a523647a Apr 24 19:06:40.899995 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.899578 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:38 +0000 UTC" deadline="2028-02-09 18:26:15.64011753 +0000 UTC" Apr 24 19:06:40.899995 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.899844 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15743h19m34.74028164s" Apr 24 19:06:40.981107 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.981052 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" event={"ID":"226dc5e6-2421-4eb0-8a0b-d99de2ce9568","Type":"ContainerStarted","Data":"a0279032231f7b1334a72a7db31b19f84f43e17ae571690a719b7a94a523647a"} Apr 24 19:06:40.982442 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.982382 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-7kd45" event={"ID":"281b0be6-1dcd-4e12-a8a1-7617ae9111e8","Type":"ContainerStarted","Data":"8e349392a8fb16cfc85a7d852c6a87420ea85c1e4fee44d52e4c560950eb5c07"} Apr 24 19:06:40.985575 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.985534 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z9xw6" event={"ID":"d318f255-7c0b-4460-b175-7e79533a85e2","Type":"ContainerStarted","Data":"ddc94c7bdd0a517f0011371ab061279389a4ad43371f23d526bdfc2b08c234e7"} Apr 24 19:06:40.994911 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:40.994878 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8wdng" event={"ID":"f92f49a6-73c3-429f-83d7-d170a250611b","Type":"ContainerStarted","Data":"9f1c4bedaeb55f8f05f7abc3885ac7ff34a30ba2623d88e11ad26c93c670b12a"} Apr 24 19:06:41.003767 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.003704 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"6f64bead56e47731f75cfee136bd585d06a484f63457282806ba2414fe2fa626"} Apr 24 19:06:41.008732 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.008700 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" event={"ID":"027943e939a2d76cdb600f777d89968b","Type":"ContainerStarted","Data":"d44d637e45e0a18932ddeadf0f08a58597f2ab4efd46c91fecfa1ca9bc8f186f"} Apr 24 19:06:41.012326 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.012218 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2bnb" event={"ID":"917aae83-6ba1-4789-900b-362379b7e091","Type":"ContainerStarted","Data":"f28dd77381178e38d14bd8385023671c4b583020598a799eb43b1c7f1d6736dc"} Apr 24 19:06:41.015888 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.015861 2565 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" event={"ID":"82b67635-7eb8-44b6-9ea2-8d4e7edf454a","Type":"ContainerStarted","Data":"db7d0882124101f5d4c3467a129f33c40aa361541606e477d5b5228d7e2dc9f4"} Apr 24 19:06:41.024604 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.024571 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerStarted","Data":"1147d09e7c521e36fc89111d1332433a16277966710b5cf4f063445750a03c07"} Apr 24 19:06:41.465480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.464878 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:41.465480 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.465058 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:41.465480 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.465126 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.46510647 +0000 UTC m=+6.103114454 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:41.565815 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.565775 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:41.566067 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.566009 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:41.566067 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.566029 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:41.566067 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.566043 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:41.566254 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.566105 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:43.566087542 +0000 UTC m=+6.204095525 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:41.965189 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.965156 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:41.965645 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.965294 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:41.966003 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:41.965813 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:41.966003 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:41.965913 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:42.035041 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.034454 2565 generic.go:358] "Generic (PLEG): container finished" podID="1427eed4f2cd472349fa20b6f1cf215c" containerID="2b1f99489d96db6fe51bee64dd1f1d5d73ad5b63fb8fb397ee37d787cdff1923" exitCode=0 Apr 24 19:06:42.036207 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.035923 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerDied","Data":"2b1f99489d96db6fe51bee64dd1f1d5d73ad5b63fb8fb397ee37d787cdff1923"} Apr 24 19:06:42.049683 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.049626 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-142.ec2.internal" podStartSLOduration=3.049606494 podStartE2EDuration="3.049606494s" podCreationTimestamp="2026-04-24 19:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:41.023045523 +0000 UTC m=+3.661053525" watchObservedRunningTime="2026-04-24 19:06:42.049606494 +0000 UTC m=+4.687614498" Apr 24 19:06:42.184073 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.184028 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-qh6dj"] Apr 24 19:06:42.187309 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.187284 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.187443 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:42.187366 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:42.272560 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.272235 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-dbus\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.272560 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.272287 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.272560 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.272321 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-kubelet-config\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.373604 2565 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-dbus\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.373650 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.373685 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-kubelet-config\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.373805 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-kubelet-config\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.373940 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/71bb1727-5459-4ea6-ade3-0cf8e2a63038-dbus\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:42.374065 2565 secret.go:189] Couldn't get 
secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:42.374148 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:42.374122 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:42.874103141 +0000 UTC m=+5.512111123 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:42.879389 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:42.878797 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:42.879389 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:42.878963 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:42.879389 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:42.879041 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:43.879021181 +0000 UTC m=+6.517029162 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:43.046498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.046457 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" event={"ID":"1427eed4f2cd472349fa20b6f1cf215c","Type":"ContainerStarted","Data":"44fb30448b61d4c23b92ff7333522b66c4e6a75347b0dbf6a8b859ca0e948b23"} Apr 24 19:06:43.483561 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.483482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:43.483742 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.483648 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.483742 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.483719 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:47.483700742 +0000 UTC m=+10.121708738 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:43.584626 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.584588 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:43.584813 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.584775 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:43.584813 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.584797 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:43.584813 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.584809 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.584958 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.584875 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:47.58485645 +0000 UTC m=+10.222864445 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:43.886612 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.886451 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:43.886781 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.886616 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:43.886781 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.886692 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:45.886671967 +0000 UTC m=+8.524679962 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:43.965381 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.965300 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:43.965552 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.965502 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:43.965955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.965935 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:43.966302 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.966095 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:43.966302 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:43.966183 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:43.966302 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:43.966262 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:45.905111 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:45.905066 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:45.905590 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:45.905247 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:45.905590 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:45.905332 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:49.905309297 +0000 UTC m=+12.543317279 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:45.966432 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:45.966054 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:45.966432 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:45.966184 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:45.966671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:45.966534 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:45.966671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:45.966579 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:45.966774 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:45.966678 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:45.966774 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:45.966761 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:47.519269 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:47.519232 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:47.519684 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.519406 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:47.519684 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.519493 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:55.51947166 +0000 UTC m=+18.157479649 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:47.620288 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:47.620251 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:47.620475 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.620430 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:47.620475 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.620455 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:47.620475 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.620470 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:47.620633 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.620534 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:06:55.620515866 +0000 UTC m=+18.258523850 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:47.966656 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:47.966557 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:47.966853 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.966668 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:47.967234 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:47.967009 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:47.967234 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:47.967025 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:47.967234 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.967137 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:47.967234 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:47.967230 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:49.937280 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:49.937239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:49.937691 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:49.937402 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:49.937691 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:49.937470 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:57.937455763 +0000 UTC m=+20.575463743 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:06:49.965672 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:49.965637 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:49.965864 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:49.965687 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:49.965864 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:49.965728 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:49.965864 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:49.965837 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:49.966036 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:49.965931 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:49.966086 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:49.966047 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:51.965419 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:51.965390 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:51.965917 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:51.965397 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:51.965917 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:51.965527 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:51.965917 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:51.965396 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:51.965917 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:51.965660 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:51.965917 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:51.965780 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:53.965933 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:53.965893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:06:53.966439 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:53.965893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:53.966439 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:53.966042 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:06:53.966439 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:53.965893 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:06:53.966439 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:53.966116 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384" Apr 24 19:06:53.966439 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:53.966163 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7" Apr 24 19:06:55.573860 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:55.573816 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:06:55.574258 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.573964 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:55.574258 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.574052 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.574037169 +0000 UTC m=+34.212045149 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:55.674720 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:55.674673 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:06:55.674903 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.674852 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:55.674903 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.674877 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:55.674903 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.674891 2565 projected.go:194] Error preparing data for projected volume kube-api-access-dnxpl for pod openshift-network-diagnostics/network-check-target-7kd4p: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:55.675082 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.674959 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl podName:127bc72d-f6ec-4cad-8e85-c35350f4ded7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.674938771 +0000 UTC m=+34.312946764 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dnxpl" (UniqueName: "kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl") pod "network-check-target-7kd4p" (UID: "127bc72d-f6ec-4cad-8e85-c35350f4ded7") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:55.965944 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:55.965909 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:06:55.966144 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:55.965994 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:06:55.966144 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:55.966101 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:06:55.966144 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.966115 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:06:55.966300 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.966210 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:06:55.966357 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:55.966326 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:06:57.969604 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:57.969256 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:06:57.970248 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:57.969694 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:06:57.970248 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:57.969778 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:06:57.970248 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:57.969784 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:06:57.970413 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:57.970251 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:06:57.970471 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:57.970407 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:06:57.991482 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:57.991453 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:06:57.991615 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:57.991568 2565 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:57.991654 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:57.991620 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret podName:71bb1727-5459-4ea6-ade3-0cf8e2a63038 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:13.991607166 +0000 UTC m=+36.629615146 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret") pod "global-pull-secret-syncer-qh6dj" (UID: "71bb1727-5459-4ea6-ade3-0cf8e2a63038") : object "kube-system"/"original-pull-secret" not registered
Apr 24 19:06:58.074683 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.074648 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" event={"ID":"82b67635-7eb8-44b6-9ea2-8d4e7edf454a","Type":"ContainerStarted","Data":"8ea4fcb8b3a65a96cb09ccc5572ea1a72c8aa698b92853e47d3a69fb0893b035"}
Apr 24 19:06:58.076231 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.076202 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="581d3928d64242fb57efa7a6fe2014c7b23b0fb3839735a9f9a2195dd3664714" exitCode=0
Apr 24 19:06:58.076347 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.076252 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"581d3928d64242fb57efa7a6fe2014c7b23b0fb3839735a9f9a2195dd3664714"}
Apr 24 19:06:58.077808 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.077766 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" event={"ID":"226dc5e6-2421-4eb0-8a0b-d99de2ce9568","Type":"ContainerStarted","Data":"ef7ad988b0e31118d7c212ad4b5eb364dd9467b2f79cb1a145b51179c1a8e1eb"}
Apr 24 19:06:58.079341 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.079228 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7kd45" event={"ID":"281b0be6-1dcd-4e12-a8a1-7617ae9111e8","Type":"ContainerStarted","Data":"99adae9b41d3b8df04311024c91f2600933bbd49fdee03b69d3049300617fea9"}
Apr 24 19:06:58.080736 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.080711 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z9xw6" event={"ID":"d318f255-7c0b-4460-b175-7e79533a85e2","Type":"ContainerStarted","Data":"b1c735a037e4be784f0da2535ada23775546bda02ad7f7fdc92598d947689e23"}
Apr 24 19:06:58.082383 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.082365 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-8wdng" event={"ID":"f92f49a6-73c3-429f-83d7-d170a250611b","Type":"ContainerStarted","Data":"9e2a1bc7d05d60ad4e41eb3edabaf96d9471edace032ef6ca76b773c36d1bb09"}
Apr 24 19:06:58.084274 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.084257 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log"
Apr 24 19:06:58.084568 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.084549 2565 generic.go:358] "Generic (PLEG): container finished" podID="ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6" containerID="4e6b36df3848257e54ba150b58f06b6d2e0ea3d02072f6dd8f27888709948bba" exitCode=1
Apr 24 19:06:58.084632 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.084578 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"a577c7d224a1fed27b9c19ba6b12e5557c19f02598bb04931fbe79138dd4d1fb"}
Apr 24 19:06:58.084632 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.084593 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerDied","Data":"4e6b36df3848257e54ba150b58f06b6d2e0ea3d02072f6dd8f27888709948bba"}
Apr 24 19:06:58.084632 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.084603 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"290b1b24e195a6a49cb5a01c77a49943d832e3741e58a2fea107e21ce95ce4f9"}
Apr 24 19:06:58.090282 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.090243 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-142.ec2.internal" podStartSLOduration=19.090230176 podStartE2EDuration="19.090230176s" podCreationTimestamp="2026-04-24 19:06:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:43.061016155 +0000 UTC m=+5.699024153" watchObservedRunningTime="2026-04-24 19:06:58.090230176 +0000 UTC m=+20.728238174"
Apr 24 19:06:58.104377 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.104335 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-8wdng" podStartSLOduration=3.27574009 podStartE2EDuration="20.104318928s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.573297658 +0000 UTC m=+3.211305637" lastFinishedPulling="2026-04-24 19:06:57.401876496 +0000 UTC m=+20.039884475" observedRunningTime="2026-04-24 19:06:58.104146471 +0000 UTC m=+20.742154473" watchObservedRunningTime="2026-04-24 19:06:58.104318928 +0000 UTC m=+20.742326930"
Apr 24 19:06:58.104600 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.104571 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4k6xp" podStartSLOduration=3.279768057 podStartE2EDuration="20.104563372s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.577076263 +0000 UTC m=+3.215084257" lastFinishedPulling="2026-04-24 19:06:57.40187158 +0000 UTC m=+20.039879572" observedRunningTime="2026-04-24 19:06:58.090042498 +0000 UTC m=+20.728050496" watchObservedRunningTime="2026-04-24 19:06:58.104563372 +0000 UTC m=+20.742571373"
Apr 24 19:06:58.153149 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.152866 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7kd45" podStartSLOduration=3.306054722 podStartE2EDuration="20.152830758s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.578529567 +0000 UTC m=+3.216537551" lastFinishedPulling="2026-04-24 19:06:57.425305595 +0000 UTC m=+20.063313587" observedRunningTime="2026-04-24 19:06:58.13587556 +0000 UTC m=+20.773883558" watchObservedRunningTime="2026-04-24 19:06:58.152830758 +0000 UTC m=+20.790838772"
Apr 24 19:06:58.929817 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:58.929792 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 19:06:59.089819 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.089723 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-f2bnb" event={"ID":"917aae83-6ba1-4789-900b-362379b7e091","Type":"ContainerStarted","Data":"9355e1c5d45c73e1fe3c93224f463feb7dcdb8ab16350f6647ce1d99f16bffcb"}
Apr 24 19:06:59.092120 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.092091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" event={"ID":"226dc5e6-2421-4eb0-8a0b-d99de2ce9568","Type":"ContainerStarted","Data":"8ec728588cf950b8dbc96d4d68dcbe1f560ff9eaafdd70bcf7f2b414d9ebc9ab"}
Apr 24 19:06:59.097616 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.097588 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log"
Apr 24 19:06:59.098060 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.098012 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"9f7dd3e62b920c5469b84696fff093897fb95fe4366faacba82af8a85b22be35"}
Apr 24 19:06:59.098161 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.098073 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"11861bc0396e63fb49e13fa16b28fa45a58b92f0bb2a3fdf32481cf8d4df1ef2"}
Apr 24 19:06:59.098161 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.098088 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"79e57e47b7704fb1a4b7a7ef1d9906587c0de60aa011d5e5d382b6fa2ff7b8b2"}
Apr 24 19:06:59.105234 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.105192 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z9xw6" podStartSLOduration=4.220875457 podStartE2EDuration="21.105180306s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.576156158 +0000 UTC m=+3.214164150" lastFinishedPulling="2026-04-24 19:06:57.460461006 +0000 UTC m=+20.098468999" observedRunningTime="2026-04-24 19:06:58.15220897 +0000 UTC m=+20.790216972" watchObservedRunningTime="2026-04-24 19:06:59.105180306 +0000 UTC m=+21.743188318"
Apr 24 19:06:59.105431 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.105408 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-f2bnb" podStartSLOduration=4.28302145 podStartE2EDuration="21.105402262s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.579489725 +0000 UTC m=+3.217497718" lastFinishedPulling="2026-04-24 19:06:57.401870548 +0000 UTC m=+20.039878530" observedRunningTime="2026-04-24 19:06:59.104737521 +0000 UTC m=+21.742745523" watchObservedRunningTime="2026-04-24 19:06:59.105402262 +0000 UTC m=+21.743410265"
Apr 24 19:06:59.773221 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.773187 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:06:59.773811 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.773786 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:06:59.909040 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.908930 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:06:58.929810124Z","UUID":"de16f5c1-1a64-40c0-8915-1fbf4a92703e","Handler":null,"Name":"","Endpoint":""}
Apr 24 19:06:59.911131 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.911110 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 19:06:59.911244 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.911141 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 19:06:59.965380 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.965350 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:06:59.965561 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.965350 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:06:59.965561 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:59.965479 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:06:59.965561 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:06:59.965355 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:06:59.965735 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:59.965565 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:06:59.965735 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:06:59.965622 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:00.102707 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:00.102595 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" event={"ID":"226dc5e6-2421-4eb0-8a0b-d99de2ce9568","Type":"ContainerStarted","Data":"c5d361b8264488b4ad3428c6a65ab8dd5e0f893528d118315675697dd48bcb46"}
Apr 24 19:07:00.102707 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:00.102693 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:07:00.103362 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:00.103343 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-8wdng"
Apr 24 19:07:00.117671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:00.117624 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-dqtqv" podStartSLOduration=2.900464863 podStartE2EDuration="22.117606904s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.580495274 +0000 UTC m=+3.218503268" lastFinishedPulling="2026-04-24 19:06:59.797637317 +0000 UTC m=+22.435645309" observedRunningTime="2026-04-24 19:07:00.117325326 +0000 UTC m=+22.755333328" watchObservedRunningTime="2026-04-24 19:07:00.117606904 +0000 UTC m=+22.755614906"
Apr 24 19:07:01.106858 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:01.106819 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log"
Apr 24 19:07:01.107413 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:01.107256 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"d7590fb200b236d0f80d12728efbc86f0c19834b12cc7cda9b6004da60b250c3"}
Apr 24 19:07:01.965187 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:01.965153 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:01.965398 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:01.965154 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:01.965398 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:01.965282 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:01.965529 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:01.965388 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:07:01.965529 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:01.965166 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:01.965529 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:01.965496 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:07:03.112441 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.112200 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="74cb8f2c2ec552f9892ecf6dd4393820f2ace18d5787fcf34213ed2a490acb60" exitCode=0
Apr 24 19:07:03.112441 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.112285 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"74cb8f2c2ec552f9892ecf6dd4393820f2ace18d5787fcf34213ed2a490acb60"}
Apr 24 19:07:03.115479 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.115457 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log"
Apr 24 19:07:03.115808 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.115787 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"122cbfbcfa9c65ac6e8bd3afd5b6c6ee39ebc467e9d3e8256b01d270a2470db6"}
Apr 24 19:07:03.116113 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.116096 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:03.116178 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.116123 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:03.116302 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.116271 2565 scope.go:117] "RemoveContainer" containerID="4e6b36df3848257e54ba150b58f06b6d2e0ea3d02072f6dd8f27888709948bba"
Apr 24 19:07:03.131576 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.131554 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:03.968004 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.967262 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:03.968004 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:03.967644 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:07:03.968663 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.968362 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:03.968663 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:03.968455 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:03.968663 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:03.968542 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:03.968663 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:03.968618 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:07:04.120686 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.120661 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log"
Apr 24 19:07:04.121093 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.120957 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" event={"ID":"ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6","Type":"ContainerStarted","Data":"5b1bd41db6388fbe5081c557d8541526d3c56f13ab986dfcb6b7be0db5383c2b"}
Apr 24 19:07:04.121210 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.121194 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:04.122820 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.122791 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="7a97cb93696c93fceee4aad6b310c15237e707492fc134df9f9217ff3c0fa8e7" exitCode=0
Apr 24 19:07:04.122938 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.122841 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"7a97cb93696c93fceee4aad6b310c15237e707492fc134df9f9217ff3c0fa8e7"}
Apr 24 19:07:04.135768 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.135663 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:04.154124 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.154074 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6" podStartSLOduration=9.228687609 podStartE2EDuration="26.154060086s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.572207056 +0000 UTC m=+3.210215035" lastFinishedPulling="2026-04-24 19:06:57.497579533 +0000 UTC m=+20.135587512" observedRunningTime="2026-04-24 19:07:04.152432162 +0000 UTC m=+26.790440173" watchObservedRunningTime="2026-04-24 19:07:04.154060086 +0000 UTC m=+26.792068108"
Apr 24 19:07:04.609602 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.609571 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qh6dj"]
Apr 24 19:07:04.609759 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.609667 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:04.609759 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:04.609745 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:07:04.613264 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.613234 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7kd4p"]
Apr 24 19:07:04.613388 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.613342 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:04.613476 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:04.613451 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:04.622689 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.622660 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dxx98"]
Apr 24 19:07:04.622844 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:04.622764 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:04.622900 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:04.622869 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:07:05.127171 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:05.127076 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="ad83c225eb94fbee2d493f1a27a394e223bc52eeafcebdbaf5e013930fafdc13" exitCode=0
Apr 24 19:07:05.127171 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:05.127142 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"ad83c225eb94fbee2d493f1a27a394e223bc52eeafcebdbaf5e013930fafdc13"}
Apr 24 19:07:05.965042 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:05.965005 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:05.965240 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:05.965153 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:07:06.965183 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:06.965150 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:06.965631 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:06.965150 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:06.965631 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:06.965274 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:06.965631 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:06.965398 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:07:07.966625 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:07.966592 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:07.967092 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:07.966704 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038"
Apr 24 19:07:08.965269 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:08.965004 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:08.965486 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:08.965048 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:08.965486 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:08.965365 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7kd4p" podUID="127bc72d-f6ec-4cad-8e85-c35350f4ded7"
Apr 24 19:07:08.965486 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:08.965467 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-dxx98" podUID="5bd705be-109f-4e28-aaaf-136053b1e384"
Apr 24 19:07:09.965789 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:09.965751 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj"
Apr 24 19:07:09.966199 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:09.965889 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-qh6dj" podUID="71bb1727-5459-4ea6-ade3-0cf8e2a63038" Apr 24 19:07:10.659537 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.659452 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-142.ec2.internal" event="NodeReady" Apr 24 19:07:10.659689 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.659599 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 19:07:10.708831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.708784 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-77k7z"] Apr 24 19:07:10.713092 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.713063 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.714538 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.714323 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kn269"] Apr 24 19:07:10.716102 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.715826 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 19:07:10.716102 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.715826 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 19:07:10.716102 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.715865 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 19:07:10.716102 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.715826 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\"" Apr 24 19:07:10.717713 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.717690 2565 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.719912 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.719896 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 19:07:10.720038 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.720024 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 19:07:10.720313 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.720299 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 19:07:10.720725 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.720705 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 19:07:10.721003 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.720989 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\"" Apr 24 19:07:10.724778 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.724388 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-77k7z"] Apr 24 19:07:10.726786 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.726766 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kn269"] Apr 24 19:07:10.792035 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.791923 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h549\" (UniqueName: \"kubernetes.io/projected/d274396b-9b0c-4277-a460-e66f42e9cf03-kube-api-access-4h549\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.792035 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:07:10.791984 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv226\" (UniqueName: \"kubernetes.io/projected/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-kube-api-access-pv226\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.792035 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.792015 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.792246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.792040 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.792246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.792141 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d274396b-9b0c-4277-a460-e66f42e9cf03-tmp-dir\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.792246 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.792185 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d274396b-9b0c-4277-a460-e66f42e9cf03-config-volume\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 
19:07:10.815234 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.815180 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-ltbx5"] Apr 24 19:07:10.819211 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.819183 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.822281 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.822257 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fpmcl\"" Apr 24 19:07:10.893170 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893142 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d274396b-9b0c-4277-a460-e66f42e9cf03-config-volume\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.893316 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893182 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4h549\" (UniqueName: \"kubernetes.io/projected/d274396b-9b0c-4277-a460-e66f42e9cf03-kube-api-access-4h549\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.893316 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893210 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-hosts-file\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.893316 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893239 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pv226\" (UniqueName: 
\"kubernetes.io/projected/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-kube-api-access-pv226\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.893316 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893272 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spjz2\" (UniqueName: \"kubernetes.io/projected/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-kube-api-access-spjz2\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.893316 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-tmp-dir\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.893523 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893329 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.893523 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893390 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.893523 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:10.893453 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 24 19:07:10.893523 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:10.893515 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.393487811 +0000 UTC m=+34.031495795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found Apr 24 19:07:10.893699 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:10.893560 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:10.893699 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:10.893629 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.393595746 +0000 UTC m=+34.031603737 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found Apr 24 19:07:10.893699 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893656 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d274396b-9b0c-4277-a460-e66f42e9cf03-tmp-dir\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.893807 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d274396b-9b0c-4277-a460-e66f42e9cf03-config-volume\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.893915 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.893898 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d274396b-9b0c-4277-a460-e66f42e9cf03-tmp-dir\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.909166 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.909140 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pv226\" (UniqueName: \"kubernetes.io/projected/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-kube-api-access-pv226\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:10.909166 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.909160 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h549\" (UniqueName: 
\"kubernetes.io/projected/d274396b-9b0c-4277-a460-e66f42e9cf03-kube-api-access-4h549\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:10.965758 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.965720 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:07:10.965931 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.965908 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:07:10.968788 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.968765 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 19:07:10.968873 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.968855 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-h4gdd\"" Apr 24 19:07:10.969109 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.969090 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:07:10.969176 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.969144 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 19:07:10.969254 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.969239 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\"" Apr 24 19:07:10.994568 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.994535 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-hosts-file\") 
pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.994703 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.994575 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-spjz2\" (UniqueName: \"kubernetes.io/projected/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-kube-api-access-spjz2\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.994703 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.994596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-tmp-dir\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.994703 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.994655 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-hosts-file\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:10.994917 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:10.994901 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-tmp-dir\") pod \"node-resolver-ltbx5\" (UID: \"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:11.007374 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.007346 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-spjz2\" (UniqueName: \"kubernetes.io/projected/1aec280c-4ff6-4d47-a34b-c89512f4e5f0-kube-api-access-spjz2\") pod \"node-resolver-ltbx5\" (UID: 
\"1aec280c-4ff6-4d47-a34b-c89512f4e5f0\") " pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:11.128039 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.128006 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-ltbx5" Apr 24 19:07:11.135741 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:11.135711 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aec280c_4ff6_4d47_a34b_c89512f4e5f0.slice/crio-975058afec505fbc56bc5ae791be5a130f5c9449c9c4588b69999cc887c23ac4 WatchSource:0}: Error finding container 975058afec505fbc56bc5ae791be5a130f5c9449c9c4588b69999cc887c23ac4: Status 404 returned error can't find the container with id 975058afec505fbc56bc5ae791be5a130f5c9449c9c4588b69999cc887c23ac4 Apr 24 19:07:11.139890 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.139865 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="1914dc88f4b98c7e3ee55393b47e819e19c8a9efaf16eea5d38e32824bcbd78e" exitCode=0 Apr 24 19:07:11.140022 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.139894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"1914dc88f4b98c7e3ee55393b47e819e19c8a9efaf16eea5d38e32824bcbd78e"} Apr 24 19:07:11.140918 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.140900 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ltbx5" event={"ID":"1aec280c-4ff6-4d47-a34b-c89512f4e5f0","Type":"ContainerStarted","Data":"975058afec505fbc56bc5ae791be5a130f5c9449c9c4588b69999cc887c23ac4"} Apr 24 19:07:11.399837 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.399807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:11.400056 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.399846 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:11.400056 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.399956 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:11.400056 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.399962 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:11.400056 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.400041 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:12.400023124 +0000 UTC m=+35.038031103 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found Apr 24 19:07:11.400241 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.400071 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:12.400054268 +0000 UTC m=+35.038062247 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found Apr 24 19:07:11.600907 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.600873 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98" Apr 24 19:07:11.601150 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.601028 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 19:07:11.601150 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:11.601086 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs podName:5bd705be-109f-4e28-aaaf-136053b1e384 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:43.60107201 +0000 UTC m=+66.239079989 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs") pod "network-metrics-daemon-dxx98" (UID: "5bd705be-109f-4e28-aaaf-136053b1e384") : secret "metrics-daemon-secret" not found Apr 24 19:07:11.701569 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.701536 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:07:11.704383 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.704358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnxpl\" (UniqueName: \"kubernetes.io/projected/127bc72d-f6ec-4cad-8e85-c35350f4ded7-kube-api-access-dnxpl\") pod \"network-check-target-7kd4p\" (UID: \"127bc72d-f6ec-4cad-8e85-c35350f4ded7\") " pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:07:11.879546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.879453 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:07:11.965722 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.965498 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:07:11.969646 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:11.968732 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:07:12.056554 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.056526 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7kd4p"] Apr 24 19:07:12.060881 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:12.060854 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127bc72d_f6ec_4cad_8e85_c35350f4ded7.slice/crio-86f885897f7f060982e718ffaab57bae5df2044a50792c930f1f044a3618a135 WatchSource:0}: Error finding container 86f885897f7f060982e718ffaab57bae5df2044a50792c930f1f044a3618a135: Status 404 returned error can't find the container with id 86f885897f7f060982e718ffaab57bae5df2044a50792c930f1f044a3618a135 Apr 24 19:07:12.145717 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.145591 2565 generic.go:358] "Generic (PLEG): container finished" podID="5998df72-185d-4c96-bec0-af611d2d2173" containerID="303446bfc91694b4d7792d7bad0a68aacb5dfdc6ecc6dbebd625b5f1d1421153" exitCode=0 Apr 24 19:07:12.145717 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.145657 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerDied","Data":"303446bfc91694b4d7792d7bad0a68aacb5dfdc6ecc6dbebd625b5f1d1421153"} Apr 24 19:07:12.147055 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.147029 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-ltbx5" event={"ID":"1aec280c-4ff6-4d47-a34b-c89512f4e5f0","Type":"ContainerStarted","Data":"cb238fe3e5b4224330f46d2a5ce56bfd900638eddf1ad0f21671917339be7672"} Apr 24 
19:07:12.147930 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.147913 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7kd4p" event={"ID":"127bc72d-f6ec-4cad-8e85-c35350f4ded7","Type":"ContainerStarted","Data":"86f885897f7f060982e718ffaab57bae5df2044a50792c930f1f044a3618a135"} Apr 24 19:07:12.186550 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.186499 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-ltbx5" podStartSLOduration=2.18648448 podStartE2EDuration="2.18648448s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:12.18589437 +0000 UTC m=+34.823902397" watchObservedRunningTime="2026-04-24 19:07:12.18648448 +0000 UTC m=+34.824492493" Apr 24 19:07:12.406621 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.406533 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:12.406621 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:12.406569 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:12.407019 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:12.406996 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:12.407105 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:12.407031 2565 secret.go:189] 
Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:12.407105 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:12.407081 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.407060846 +0000 UTC m=+37.045068842 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found Apr 24 19:07:12.407221 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:12.407128 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:14.407110754 +0000 UTC m=+37.045118753 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found Apr 24 19:07:13.153938 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:13.153885 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" event={"ID":"5998df72-185d-4c96-bec0-af611d2d2173","Type":"ContainerStarted","Data":"9e3cc93cc59006f36d80cf86d3cf5434fb8eb81fae2ee84442cdfdd55d2b39e6"} Apr 24 19:07:13.179794 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:13.179742 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6zxkd" podStartSLOduration=5.084347342 podStartE2EDuration="35.179728538s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:06:40.57329216 +0000 UTC m=+3.211300142" lastFinishedPulling="2026-04-24 19:07:10.668673359 +0000 UTC m=+33.306681338" observedRunningTime="2026-04-24 19:07:13.17856446 +0000 UTC m=+35.816572474" watchObservedRunningTime="2026-04-24 19:07:13.179728538 +0000 UTC m=+35.817736539" Apr 24 19:07:14.019059 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.018786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: \"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:07:14.023663 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.023624 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/71bb1727-5459-4ea6-ade3-0cf8e2a63038-original-pull-secret\") pod \"global-pull-secret-syncer-qh6dj\" (UID: 
\"71bb1727-5459-4ea6-ade3-0cf8e2a63038\") " pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:07:14.078419 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.078380 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-qh6dj" Apr 24 19:07:14.421205 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.421124 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:14.421205 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.421169 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:14.421733 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:14.421288 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:14.421733 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:14.421290 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:14.421733 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:14.421344 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.421328938 +0000 UTC m=+41.059336917 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found Apr 24 19:07:14.421733 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:14.421362 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:18.421353314 +0000 UTC m=+41.059361293 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found Apr 24 19:07:14.895679 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:14.895649 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-qh6dj"] Apr 24 19:07:15.003756 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:15.003724 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb1727_5459_4ea6_ade3_0cf8e2a63038.slice/crio-d0621f330b338d61e91340e344892ea7e6e9a8b104e2b336fad47f58e1a2a85c WatchSource:0}: Error finding container d0621f330b338d61e91340e344892ea7e6e9a8b104e2b336fad47f58e1a2a85c: Status 404 returned error can't find the container with id d0621f330b338d61e91340e344892ea7e6e9a8b104e2b336fad47f58e1a2a85c Apr 24 19:07:15.159274 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:15.159237 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7kd4p" event={"ID":"127bc72d-f6ec-4cad-8e85-c35350f4ded7","Type":"ContainerStarted","Data":"20af21f515b8ff4f1f63316fe147baa1a4f04e23e36ea704da3435d04b480140"} Apr 24 
19:07:15.159458 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:15.159444 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7kd4p" Apr 24 19:07:15.160401 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:15.160372 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qh6dj" event={"ID":"71bb1727-5459-4ea6-ade3-0cf8e2a63038","Type":"ContainerStarted","Data":"d0621f330b338d61e91340e344892ea7e6e9a8b104e2b336fad47f58e1a2a85c"} Apr 24 19:07:15.174955 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:15.174910 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7kd4p" podStartSLOduration=34.218669793 podStartE2EDuration="37.174896284s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:07:12.063323024 +0000 UTC m=+34.701331003" lastFinishedPulling="2026-04-24 19:07:15.0195495 +0000 UTC m=+37.657557494" observedRunningTime="2026-04-24 19:07:15.174227771 +0000 UTC m=+37.812235772" watchObservedRunningTime="2026-04-24 19:07:15.174896284 +0000 UTC m=+37.812904310" Apr 24 19:07:18.450437 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:18.450213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:18.451019 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:18.450456 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:18.451019 
ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:18.450355 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:18.451019 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:18.450554 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:18.451019 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:18.450565 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.450546441 +0000 UTC m=+49.088554443 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found Apr 24 19:07:18.451019 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:18.450589 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:26.450580058 +0000 UTC m=+49.088588043 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found Apr 24 19:07:19.169418 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:19.169332 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-qh6dj" event={"ID":"71bb1727-5459-4ea6-ade3-0cf8e2a63038","Type":"ContainerStarted","Data":"3baeaa2f1f93924c090dbab7e615b15e123f0ee9301535862424861b38a7d2a8"} Apr 24 19:07:19.185389 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:19.185335 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-qh6dj" podStartSLOduration=33.415077477 podStartE2EDuration="37.18531935s" podCreationTimestamp="2026-04-24 19:06:42 +0000 UTC" firstStartedPulling="2026-04-24 19:07:15.009742207 +0000 UTC m=+37.647750186" lastFinishedPulling="2026-04-24 19:07:18.779984065 +0000 UTC m=+41.417992059" observedRunningTime="2026-04-24 19:07:19.184845209 +0000 UTC m=+41.822853221" watchObservedRunningTime="2026-04-24 19:07:19.18531935 +0000 UTC m=+41.823327342" Apr 24 19:07:24.588896 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.588858 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq"] Apr 24 19:07:24.595535 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.595510 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.598716 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.598692 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 19:07:24.598823 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.598712 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 19:07:24.598823 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.598788 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 19:07:24.599928 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.599912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 19:07:24.600048 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.599939 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-c5rln\"" Apr 24 19:07:24.603719 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.603697 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq"] Apr 24 19:07:24.623935 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.623914 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5"] Apr 24 19:07:24.626371 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.626353 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.629091 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.629072 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 19:07:24.629185 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.629145 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 19:07:24.629604 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.629586 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 19:07:24.629873 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.629859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 19:07:24.640904 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.640886 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5"] Apr 24 19:07:24.694501 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.694468 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.694694 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.694573 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h85tw\" (UniqueName: \"kubernetes.io/projected/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-kube-api-access-h85tw\") pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.795225 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795184 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795225 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795222 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5gdb\" (UniqueName: \"kubernetes.io/projected/18af177b-848c-4393-bbe6-78da1f7c1340-kube-api-access-w5gdb\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795257 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795327 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h85tw\" (UniqueName: 
\"kubernetes.io/projected/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-kube-api-access-h85tw\") pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.795480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795371 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795422 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/18af177b-848c-4393-bbe6-78da1f7c1340-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795480 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795468 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.795704 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.795497 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-hub-kubeconfig\") 
pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.798621 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.798597 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.811808 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.811779 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h85tw\" (UniqueName: \"kubernetes.io/projected/a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf-kube-api-access-h85tw\") pod \"managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq\" (UID: \"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.896312 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896226 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/18af177b-848c-4393-bbe6-78da1f7c1340-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.896312 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.896557 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896328 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.896557 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896421 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5gdb\" (UniqueName: \"kubernetes.io/projected/18af177b-848c-4393-bbe6-78da1f7c1340-kube-api-access-w5gdb\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.896557 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896481 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.896557 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.896506 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.897103 ip-10-0-138-142 kubenswrapper[2565]: 
I0424 19:07:24.897071 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/18af177b-848c-4393-bbe6-78da1f7c1340-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.899318 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.899291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-ca\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.899498 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.899477 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.899556 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.899485 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.899556 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.899515 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/18af177b-848c-4393-bbe6-78da1f7c1340-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.913041 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.913012 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5gdb\" (UniqueName: \"kubernetes.io/projected/18af177b-848c-4393-bbe6-78da1f7c1340-kube-api-access-w5gdb\") pod \"cluster-proxy-proxy-agent-765846f77b-f7pp5\" (UID: \"18af177b-848c-4393-bbe6-78da1f7c1340\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.913169 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.913102 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" Apr 24 19:07:24.934130 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.934099 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:24.955951 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:24.955926 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ltbx5_1aec280c-4ff6-4d47-a34b-c89512f4e5f0/dns-node-resolver/0.log" Apr 24 19:07:25.040936 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:25.040904 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq"] Apr 24 19:07:25.044035 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:25.044003 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7e87cad_2aa4_4ba5_b9ac_30f33d40cfbf.slice/crio-aedc02726172a41b047dc0629c955ff6ec43189157bb3909c202a3f65cd26a0b WatchSource:0}: Error finding container aedc02726172a41b047dc0629c955ff6ec43189157bb3909c202a3f65cd26a0b: Status 404 returned error can't find the container with id aedc02726172a41b047dc0629c955ff6ec43189157bb3909c202a3f65cd26a0b Apr 24 19:07:25.066341 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:25.066313 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5"] Apr 24 19:07:25.073821 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:25.073791 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18af177b_848c_4393_bbe6_78da1f7c1340.slice/crio-a5353b6e99eb96d52ea67e31f4b317ccf0ee620c0c62f6cb7640463c551a6db1 WatchSource:0}: Error finding container a5353b6e99eb96d52ea67e31f4b317ccf0ee620c0c62f6cb7640463c551a6db1: Status 404 returned error can't find the container with id a5353b6e99eb96d52ea67e31f4b317ccf0ee620c0c62f6cb7640463c551a6db1 Apr 24 19:07:25.181861 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:07:25.181819 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerStarted","Data":"a5353b6e99eb96d52ea67e31f4b317ccf0ee620c0c62f6cb7640463c551a6db1"} Apr 24 19:07:25.182739 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:25.182718 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" event={"ID":"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf","Type":"ContainerStarted","Data":"aedc02726172a41b047dc0629c955ff6ec43189157bb3909c202a3f65cd26a0b"} Apr 24 19:07:25.756090 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:25.756012 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7kd45_281b0be6-1dcd-4e12-a8a1-7617ae9111e8/node-ca/0.log" Apr 24 19:07:26.509002 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:26.508938 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z" Apr 24 19:07:26.509189 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:26.509009 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269" Apr 24 19:07:26.509244 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:26.509213 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:26.509291 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:26.509278 2565 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls podName:d274396b-9b0c-4277-a460-e66f42e9cf03 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.50925928 +0000 UTC m=+65.147267259 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls") pod "dns-default-kn269" (UID: "d274396b-9b0c-4277-a460-e66f42e9cf03") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:26.509711 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:26.509690 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:26.509789 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:26.509768 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert podName:cfdbf694-67b0-45a2-848c-4bd1f8440a3d nodeName:}" failed. No retries permitted until 2026-04-24 19:07:42.509751517 +0000 UTC m=+65.147759505 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert") pod "ingress-canary-77k7z" (UID: "cfdbf694-67b0-45a2-848c-4bd1f8440a3d") : secret "canary-serving-cert" not found
Apr 24 19:07:29.193106 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:29.193069 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerStarted","Data":"7b9e949f179dc40334083004d9a223b58569ac96ef82e721c3e2f032ca3848e0"}
Apr 24 19:07:29.194220 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:29.194198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" event={"ID":"a7e87cad-2aa4-4ba5-b9ac-30f33d40cfbf","Type":"ContainerStarted","Data":"b15d6bb0471e47f8e3ccda279249ca077b4d0ccc1facd1bf4ad162001e9d27e1"}
Apr 24 19:07:32.203284 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:32.203240 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerStarted","Data":"baa4fbd1ae80f679923a405ff23df252c13ffadc35717d25f3d01f7467b052d7"}
Apr 24 19:07:32.203284 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:32.203278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerStarted","Data":"44762af5510259ccd7bffdcb2edd24c64f595dd44261d90fbb99e5cde20d2b8f"}
Apr 24 19:07:32.228402 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:32.228338 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-6fb6968dd7-jn9kq" podStartSLOduration=4.277063299 podStartE2EDuration="8.228318062s" podCreationTimestamp="2026-04-24 19:07:24 +0000 UTC" firstStartedPulling="2026-04-24 19:07:25.046154795 +0000 UTC m=+47.684162776" lastFinishedPulling="2026-04-24 19:07:28.997409561 +0000 UTC m=+51.635417539" observedRunningTime="2026-04-24 19:07:29.214654588 +0000 UTC m=+51.852662588" watchObservedRunningTime="2026-04-24 19:07:32.228318062 +0000 UTC m=+54.866326067"
Apr 24 19:07:32.228620 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:32.228442 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" podStartSLOduration=1.77826753 podStartE2EDuration="8.228432722s" podCreationTimestamp="2026-04-24 19:07:24 +0000 UTC" firstStartedPulling="2026-04-24 19:07:25.075514119 +0000 UTC m=+47.713522098" lastFinishedPulling="2026-04-24 19:07:31.525679301 +0000 UTC m=+54.163687290" observedRunningTime="2026-04-24 19:07:32.226857927 +0000 UTC m=+54.864865927" watchObservedRunningTime="2026-04-24 19:07:32.228432722 +0000 UTC m=+54.866440724"
Apr 24 19:07:34.936781 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:34.936714 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" podUID="18af177b-848c-4393-bbe6-78da1f7c1340" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 19:07:36.139494 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:36.139459 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5frl6"
Apr 24 19:07:42.526411 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.526365 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z"
Apr 24 19:07:42.526411 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.526406 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269"
Apr 24 19:07:42.528787 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.528764 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d274396b-9b0c-4277-a460-e66f42e9cf03-metrics-tls\") pod \"dns-default-kn269\" (UID: \"d274396b-9b0c-4277-a460-e66f42e9cf03\") " pod="openshift-dns/dns-default-kn269"
Apr 24 19:07:42.528893 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.528810 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfdbf694-67b0-45a2-848c-4bd1f8440a3d-cert\") pod \"ingress-canary-77k7z\" (UID: \"cfdbf694-67b0-45a2-848c-4bd1f8440a3d\") " pod="openshift-ingress-canary/ingress-canary-77k7z"
Apr 24 19:07:42.538186 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.538165 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-rh56f\""
Apr 24 19:07:42.546326 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.546308 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kn269"
Apr 24 19:07:42.664263 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.664137 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kn269"]
Apr 24 19:07:42.666631 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:42.666605 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd274396b_9b0c_4277_a460_e66f42e9cf03.slice/crio-ad2979e4846ac41acda2c5fb0a7b3809e4aa945b2e584cb8a4f2d3c064e8ea10 WatchSource:0}: Error finding container ad2979e4846ac41acda2c5fb0a7b3809e4aa945b2e584cb8a4f2d3c064e8ea10: Status 404 returned error can't find the container with id ad2979e4846ac41acda2c5fb0a7b3809e4aa945b2e584cb8a4f2d3c064e8ea10
Apr 24 19:07:42.831583 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.831498 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-l8pmm\""
Apr 24 19:07:42.838959 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.838932 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-77k7z"
Apr 24 19:07:42.958322 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:42.958288 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-77k7z"]
Apr 24 19:07:42.961082 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:42.961052 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfdbf694_67b0_45a2_848c_4bd1f8440a3d.slice/crio-8b682b44935b04e44a2471dd17edebf7209040f01de4209465897c98ddbeba46 WatchSource:0}: Error finding container 8b682b44935b04e44a2471dd17edebf7209040f01de4209465897c98ddbeba46: Status 404 returned error can't find the container with id 8b682b44935b04e44a2471dd17edebf7209040f01de4209465897c98ddbeba46
Apr 24 19:07:43.232540 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.232498 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-77k7z" event={"ID":"cfdbf694-67b0-45a2-848c-4bd1f8440a3d","Type":"ContainerStarted","Data":"8b682b44935b04e44a2471dd17edebf7209040f01de4209465897c98ddbeba46"}
Apr 24 19:07:43.233538 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.233513 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kn269" event={"ID":"d274396b-9b0c-4277-a460-e66f42e9cf03","Type":"ContainerStarted","Data":"ad2979e4846ac41acda2c5fb0a7b3809e4aa945b2e584cb8a4f2d3c064e8ea10"}
Apr 24 19:07:43.635034 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.634942 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:43.637490 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.637458 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5bd705be-109f-4e28-aaaf-136053b1e384-metrics-certs\") pod \"network-metrics-daemon-dxx98\" (UID: \"5bd705be-109f-4e28-aaaf-136053b1e384\") " pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:43.687402 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.687369 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-tjm4v\""
Apr 24 19:07:43.694673 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.694642 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-dxx98"
Apr 24 19:07:43.835954 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:43.835896 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-dxx98"]
Apr 24 19:07:44.241190 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:44.241148 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bd705be_109f_4e28_aaaf_136053b1e384.slice/crio-b62ab4553878bcb3360de2b2cb0098c570308039ba82c853ddb81df941bd3bf6 WatchSource:0}: Error finding container b62ab4553878bcb3360de2b2cb0098c570308039ba82c853ddb81df941bd3bf6: Status 404 returned error can't find the container with id b62ab4553878bcb3360de2b2cb0098c570308039ba82c853ddb81df941bd3bf6
Apr 24 19:07:44.936021 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:44.935956 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" podUID="18af177b-848c-4393-bbe6-78da1f7c1340" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 19:07:45.241244 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:45.241185 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kn269" event={"ID":"d274396b-9b0c-4277-a460-e66f42e9cf03","Type":"ContainerStarted","Data":"6fc393f754059a2946387208dc6e55d97cac8114f76d27a0a20b53a45580545b"}
Apr 24 19:07:45.242398 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:45.242371 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dxx98" event={"ID":"5bd705be-109f-4e28-aaaf-136053b1e384","Type":"ContainerStarted","Data":"b62ab4553878bcb3360de2b2cb0098c570308039ba82c853ddb81df941bd3bf6"}
Apr 24 19:07:45.243546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:45.243521 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-77k7z" event={"ID":"cfdbf694-67b0-45a2-848c-4bd1f8440a3d","Type":"ContainerStarted","Data":"b989d71ae5ad9d39c9433e25e5fca36ff60a0e5362268d420e052b87970f5c14"}
Apr 24 19:07:45.261268 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:45.260791 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-77k7z" podStartSLOduration=33.182993233 podStartE2EDuration="35.260773604s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="2026-04-24 19:07:42.962912852 +0000 UTC m=+65.600920831" lastFinishedPulling="2026-04-24 19:07:45.040693221 +0000 UTC m=+67.678701202" observedRunningTime="2026-04-24 19:07:45.25984696 +0000 UTC m=+67.897854963" watchObservedRunningTime="2026-04-24 19:07:45.260773604 +0000 UTC m=+67.898781606"
Apr 24 19:07:46.165487 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:46.165456 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7kd4p"
Apr 24 19:07:46.249124 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:46.249086 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kn269" event={"ID":"d274396b-9b0c-4277-a460-e66f42e9cf03","Type":"ContainerStarted","Data":"59ccfa4dbe054cbe2431114650a36924c87aaf19320554f43c3ac07612bf2d4b"}
Apr 24 19:07:46.249294 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:46.249194 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kn269"
Apr 24 19:07:46.250736 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:46.250703 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dxx98" event={"ID":"5bd705be-109f-4e28-aaaf-136053b1e384","Type":"ContainerStarted","Data":"7614dcf6b848f2423f57d9ba1533cd88b0e8748110449d340287616165bd2f52"}
Apr 24 19:07:46.273478 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:46.273427 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kn269" podStartSLOduration=33.908300798 podStartE2EDuration="36.273407704s" podCreationTimestamp="2026-04-24 19:07:10 +0000 UTC" firstStartedPulling="2026-04-24 19:07:42.668441943 +0000 UTC m=+65.306449924" lastFinishedPulling="2026-04-24 19:07:45.03354884 +0000 UTC m=+67.671556830" observedRunningTime="2026-04-24 19:07:46.272157566 +0000 UTC m=+68.910165567" watchObservedRunningTime="2026-04-24 19:07:46.273407704 +0000 UTC m=+68.911415707"
Apr 24 19:07:47.255205 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.255167 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-dxx98" event={"ID":"5bd705be-109f-4e28-aaaf-136053b1e384","Type":"ContainerStarted","Data":"3d5594803c3ee57338440472f1291ebf51db97eb6d58c767f78fdc28367b45f8"}
Apr 24 19:07:47.278490 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.278432 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-dxx98" podStartSLOduration=67.492872237 podStartE2EDuration="1m9.278418761s" podCreationTimestamp="2026-04-24 19:06:38 +0000 UTC" firstStartedPulling="2026-04-24 19:07:44.243055925 +0000 UTC m=+66.881063904" lastFinishedPulling="2026-04-24 19:07:46.028602443 +0000 UTC m=+68.666610428" observedRunningTime="2026-04-24 19:07:47.278223206 +0000 UTC m=+69.916231208" watchObservedRunningTime="2026-04-24 19:07:47.278418761 +0000 UTC m=+69.916426762"
Apr 24 19:07:47.552487 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.552401 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6f8xt"]
Apr 24 19:07:47.598736 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.598699 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6f8xt"]
Apr 24 19:07:47.598874 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.598828 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.601898 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.601875 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 19:07:47.601898 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.601879 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 24 19:07:47.602306 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.602287 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 19:07:47.603076 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.603057 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-bjv5f\""
Apr 24 19:07:47.603166 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.603114 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 19:07:47.663671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.663633 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd7f4efb-5956-404b-8513-49f0a6384dd1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.663671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.663679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd7f4efb-5956-404b-8513-49f0a6384dd1-crio-socket\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.663909 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.663696 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.663909 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.663766 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd7f4efb-5956-404b-8513-49f0a6384dd1-data-volume\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.663909 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.663811 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcf5x\" (UniqueName: \"kubernetes.io/projected/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-api-access-pcf5x\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764313 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764270 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd7f4efb-5956-404b-8513-49f0a6384dd1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764313 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764318 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd7f4efb-5956-404b-8513-49f0a6384dd1-crio-socket\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764541 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764339 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764541 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764370 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd7f4efb-5956-404b-8513-49f0a6384dd1-data-volume\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764541 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764397 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pcf5x\" (UniqueName: \"kubernetes.io/projected/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-api-access-pcf5x\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764541 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/cd7f4efb-5956-404b-8513-49f0a6384dd1-crio-socket\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.764894 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.764872 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/cd7f4efb-5956-404b-8513-49f0a6384dd1-data-volume\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.765075 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.765059 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.766774 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.766747 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/cd7f4efb-5956-404b-8513-49f0a6384dd1-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.773539 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.773513 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcf5x\" (UniqueName: \"kubernetes.io/projected/cd7f4efb-5956-404b-8513-49f0a6384dd1-kube-api-access-pcf5x\") pod \"insights-runtime-extractor-6f8xt\" (UID: \"cd7f4efb-5956-404b-8513-49f0a6384dd1\") " pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:47.907188 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:47.907093 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6f8xt"
Apr 24 19:07:48.044306 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:48.044281 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6f8xt"]
Apr 24 19:07:48.046839 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:48.046810 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7f4efb_5956_404b_8513_49f0a6384dd1.slice/crio-feabfbd3e6aee5f76364b249d0837533144aeea2029cbe8f8e68115de72341e6 WatchSource:0}: Error finding container feabfbd3e6aee5f76364b249d0837533144aeea2029cbe8f8e68115de72341e6: Status 404 returned error can't find the container with id feabfbd3e6aee5f76364b249d0837533144aeea2029cbe8f8e68115de72341e6
Apr 24 19:07:48.259230 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:48.259197 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6f8xt" event={"ID":"cd7f4efb-5956-404b-8513-49f0a6384dd1","Type":"ContainerStarted","Data":"f69cecf811f67c6e06aba25a1ccec9e869a82a021814641b760f6528036511cd"}
Apr 24 19:07:48.259230 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:48.259235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6f8xt" event={"ID":"cd7f4efb-5956-404b-8513-49f0a6384dd1","Type":"ContainerStarted","Data":"feabfbd3e6aee5f76364b249d0837533144aeea2029cbe8f8e68115de72341e6"}
Apr 24 19:07:50.266020 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:50.265916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6f8xt" event={"ID":"cd7f4efb-5956-404b-8513-49f0a6384dd1","Type":"ContainerStarted","Data":"09f196ec92abc605354fb679a8071bb4ddf753e29c4ae0a8a17780eda4792992"}
Apr 24 19:07:51.676617 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.676542 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"]
Apr 24 19:07:51.679530 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.679513 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683267 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-d7g57\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683279 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683272 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683295 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683358 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683359 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 24 19:07:51.683406 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683402 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 24 19:07:51.683720 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.683662 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 24 19:07:51.690079 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.690199 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690089 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.690199 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690125 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.690199 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690147 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.690329 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690197 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.690329 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.690224 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9chs6\" (UniqueName: \"kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.695086 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.695067 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"]
Apr 24 19:07:51.791309 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791268 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.791309 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791307 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9chs6\" (UniqueName: \"kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.791546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791338 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.791546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791362 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.791546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791397 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.791546 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.791422 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.792066 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.792044 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.792170 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.792130 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.792300 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.792280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.793798 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.793774 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.793904 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.793834 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.802394 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.802367 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9chs6\" (UniqueName: \"kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6\") pod \"console-c9fc5b9c5-mm752\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " pod="openshift-console/console-c9fc5b9c5-mm752"
Apr 24 19:07:51.831093 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.831065 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rhjv8"]
Apr 24 19:07:51.835049 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.835029 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8"
Apr 24 19:07:51.837666 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.837648 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 19:07:51.838025 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.838002 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-45jgt\""
Apr 24 19:07:51.838153 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.838040 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 19:07:51.838153 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.838067 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 24 19:07:51.838153 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.838075 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 24 19:07:51.838153 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.838150 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 19:07:51.844252 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.844232 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rhjv8"]
Apr 24 19:07:51.892614 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.892574 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/806d34d7-6f49-489a-8b8f-c6ae79b29adb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.892614 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.892611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.892820 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.892634 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.892820 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.892724 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjhgz\" (UniqueName: \"kubernetes.io/projected/806d34d7-6f49-489a-8b8f-c6ae79b29adb-kube-api-access-zjhgz\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.988361 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.988330 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:07:51.993216 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.993189 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/806d34d7-6f49-489a-8b8f-c6ae79b29adb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.993332 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.993223 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.993332 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.993249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.993332 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.993293 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjhgz\" (UniqueName: \"kubernetes.io/projected/806d34d7-6f49-489a-8b8f-c6ae79b29adb-kube-api-access-zjhgz\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.993504 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:51.993364 2565 
secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 24 19:07:51.993504 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:07:51.993442 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls podName:806d34d7-6f49-489a-8b8f-c6ae79b29adb nodeName:}" failed. No retries permitted until 2026-04-24 19:07:52.49342161 +0000 UTC m=+75.131429592 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-rhjv8" (UID: "806d34d7-6f49-489a-8b8f-c6ae79b29adb") : secret "prometheus-operator-tls" not found Apr 24 19:07:51.993889 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.993868 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/806d34d7-6f49-489a-8b8f-c6ae79b29adb-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:51.995906 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:51.995736 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:52.005651 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.005619 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjhgz\" (UniqueName: 
\"kubernetes.io/projected/806d34d7-6f49-489a-8b8f-c6ae79b29adb-kube-api-access-zjhgz\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:52.107813 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.107780 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"] Apr 24 19:07:52.112647 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:52.112616 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974d0d2e_a0ad_4b68_aff3_35d352d38ea2.slice/crio-42438d38bb823ca370a88bc0438a3edc6fb296b7e68e7b2d4a56972597460664 WatchSource:0}: Error finding container 42438d38bb823ca370a88bc0438a3edc6fb296b7e68e7b2d4a56972597460664: Status 404 returned error can't find the container with id 42438d38bb823ca370a88bc0438a3edc6fb296b7e68e7b2d4a56972597460664 Apr 24 19:07:52.271959 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.271852 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9fc5b9c5-mm752" event={"ID":"974d0d2e-a0ad-4b68-aff3-35d352d38ea2","Type":"ContainerStarted","Data":"42438d38bb823ca370a88bc0438a3edc6fb296b7e68e7b2d4a56972597460664"} Apr 24 19:07:52.273663 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.273632 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6f8xt" event={"ID":"cd7f4efb-5956-404b-8513-49f0a6384dd1","Type":"ContainerStarted","Data":"2e1b3c58a2f3ba40778f15fc6dfe7504a457cf83bd3a016a82dc1808a131e15d"} Apr 24 19:07:52.292773 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.292722 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6f8xt" podStartSLOduration=2.191457608 podStartE2EDuration="5.292708084s" podCreationTimestamp="2026-04-24 19:07:47 
+0000 UTC" firstStartedPulling="2026-04-24 19:07:48.174049589 +0000 UTC m=+70.812057568" lastFinishedPulling="2026-04-24 19:07:51.275300051 +0000 UTC m=+73.913308044" observedRunningTime="2026-04-24 19:07:52.291052778 +0000 UTC m=+74.929060779" watchObservedRunningTime="2026-04-24 19:07:52.292708084 +0000 UTC m=+74.930716085" Apr 24 19:07:52.498399 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.498343 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:52.501313 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.501281 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/806d34d7-6f49-489a-8b8f-c6ae79b29adb-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-rhjv8\" (UID: \"806d34d7-6f49-489a-8b8f-c6ae79b29adb\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:52.744857 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.744825 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" Apr 24 19:07:52.898248 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:52.898213 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-rhjv8"] Apr 24 19:07:52.901788 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:52.901757 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod806d34d7_6f49_489a_8b8f_c6ae79b29adb.slice/crio-8daf84b2be96dfe618fd23862cd24b0962ab09dcd87efeb67cfb23419409d096 WatchSource:0}: Error finding container 8daf84b2be96dfe618fd23862cd24b0962ab09dcd87efeb67cfb23419409d096: Status 404 returned error can't find the container with id 8daf84b2be96dfe618fd23862cd24b0962ab09dcd87efeb67cfb23419409d096 Apr 24 19:07:53.277781 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:53.277742 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" event={"ID":"806d34d7-6f49-489a-8b8f-c6ae79b29adb","Type":"ContainerStarted","Data":"8daf84b2be96dfe618fd23862cd24b0962ab09dcd87efeb67cfb23419409d096"} Apr 24 19:07:54.935332 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:54.935289 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" podUID="18af177b-848c-4393-bbe6-78da1f7c1340" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 19:07:54.935683 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:54.935359 2565 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" Apr 24 19:07:54.935840 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:54.935810 2565 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" 
containerStatusID={"Type":"cri-o","ID":"baa4fbd1ae80f679923a405ff23df252c13ffadc35717d25f3d01f7467b052d7"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 19:07:54.935886 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:54.935871 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" podUID="18af177b-848c-4393-bbe6-78da1f7c1340" containerName="service-proxy" containerID="cri-o://baa4fbd1ae80f679923a405ff23df252c13ffadc35717d25f3d01f7467b052d7" gracePeriod=30 Apr 24 19:07:55.285230 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.285138 2565 generic.go:358] "Generic (PLEG): container finished" podID="18af177b-848c-4393-bbe6-78da1f7c1340" containerID="baa4fbd1ae80f679923a405ff23df252c13ffadc35717d25f3d01f7467b052d7" exitCode=2 Apr 24 19:07:55.285230 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.285202 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerDied","Data":"baa4fbd1ae80f679923a405ff23df252c13ffadc35717d25f3d01f7467b052d7"} Apr 24 19:07:55.285422 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.285235 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-765846f77b-f7pp5" event={"ID":"18af177b-848c-4393-bbe6-78da1f7c1340","Type":"ContainerStarted","Data":"8f84f564fc0318fa3ad888ee81adb7089a415b052fecb2549cffc6478bb52b26"} Apr 24 19:07:55.286646 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.286611 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9fc5b9c5-mm752" 
event={"ID":"974d0d2e-a0ad-4b68-aff3-35d352d38ea2","Type":"ContainerStarted","Data":"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2"} Apr 24 19:07:55.288203 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.288177 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" event={"ID":"806d34d7-6f49-489a-8b8f-c6ae79b29adb","Type":"ContainerStarted","Data":"6a035d2598d8850cfae3f67a241f8a53bed91b2cf731cc2bbdc187fddae92b21"} Apr 24 19:07:55.288203 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.288206 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" event={"ID":"806d34d7-6f49-489a-8b8f-c6ae79b29adb","Type":"ContainerStarted","Data":"2ca56a3bde491d4d2f9f3771f4bbd78aaf3189aabc67b42defbbd0b33b502138"} Apr 24 19:07:55.354249 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:55.354194 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-rhjv8" podStartSLOduration=2.258629005 podStartE2EDuration="4.354178801s" podCreationTimestamp="2026-04-24 19:07:51 +0000 UTC" firstStartedPulling="2026-04-24 19:07:52.903936711 +0000 UTC m=+75.541944690" lastFinishedPulling="2026-04-24 19:07:54.999486493 +0000 UTC m=+77.637494486" observedRunningTime="2026-04-24 19:07:55.353148942 +0000 UTC m=+77.991156941" watchObservedRunningTime="2026-04-24 19:07:55.354178801 +0000 UTC m=+77.992186801" Apr 24 19:07:56.257390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:56.257358 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kn269" Apr 24 19:07:56.279317 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:56.279264 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c9fc5b9c5-mm752" podStartSLOduration=2.394399255 podStartE2EDuration="5.279246133s" 
podCreationTimestamp="2026-04-24 19:07:51 +0000 UTC" firstStartedPulling="2026-04-24 19:07:52.114542426 +0000 UTC m=+74.752550406" lastFinishedPulling="2026-04-24 19:07:54.999389304 +0000 UTC m=+77.637397284" observedRunningTime="2026-04-24 19:07:55.371512186 +0000 UTC m=+78.009520187" watchObservedRunningTime="2026-04-24 19:07:56.279246133 +0000 UTC m=+78.917254128" Apr 24 19:07:57.333587 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.333547 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-nxnw4"] Apr 24 19:07:57.337152 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.337130 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.343869 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.343822 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 19:07:57.343999 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.343942 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2nlmz\"" Apr 24 19:07:57.344945 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.344929 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 19:07:57.345037 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.344960 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 19:07:57.437643 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437611 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-textfile\") pod \"node-exporter-nxnw4\" (UID: 
\"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437643 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437647 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437679 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-root\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437737 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-sys\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437754 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-tls\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437783 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" 
(UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-wtmp\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437813 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.437839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437828 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-metrics-client-ca\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.438052 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.437851 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krp28\" (UniqueName: \"kubernetes.io/projected/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-kube-api-access-krp28\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538569 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-sys\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538569 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:07:57.538570 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-tls\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-wtmp\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538635 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538649 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-sys\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538730 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-metrics-client-ca\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 
ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krp28\" (UniqueName: \"kubernetes.io/projected/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-kube-api-access-krp28\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538790 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-wtmp\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.538822 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538801 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-textfile\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539200 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538844 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539200 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.538911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-root\") pod \"node-exporter-nxnw4\" (UID: 
\"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539200 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.539000 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-root\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539345 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.539210 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-textfile\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539405 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.539350 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-metrics-client-ca\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.539530 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.539502 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-accelerators-collector-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.541001 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.540962 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.541162 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.541137 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-node-exporter-tls\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.547351 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.547329 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krp28\" (UniqueName: \"kubernetes.io/projected/d9dc8854-e8d8-4f1d-a016-3d7507cdf28e-kube-api-access-krp28\") pod \"node-exporter-nxnw4\" (UID: \"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e\") " pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.646102 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:57.646023 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-nxnw4" Apr 24 19:07:57.654778 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:07:57.654735 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dc8854_e8d8_4f1d_a016_3d7507cdf28e.slice/crio-f81fe229057070d7d3fc7571cd1cd855778622e97378b52c06197b3b0f67bab8 WatchSource:0}: Error finding container f81fe229057070d7d3fc7571cd1cd855778622e97378b52c06197b3b0f67bab8: Status 404 returned error can't find the container with id f81fe229057070d7d3fc7571cd1cd855778622e97378b52c06197b3b0f67bab8 Apr 24 19:07:58.297669 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:58.297618 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxnw4" event={"ID":"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e","Type":"ContainerStarted","Data":"f81fe229057070d7d3fc7571cd1cd855778622e97378b52c06197b3b0f67bab8"} Apr 24 19:07:59.302095 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:59.302059 2565 generic.go:358] "Generic (PLEG): container finished" podID="d9dc8854-e8d8-4f1d-a016-3d7507cdf28e" containerID="5824bca3909170e4a58a2954115ab9eb08759b57e2b5b3b492b1c5770432a1c2" exitCode=0 Apr 24 19:07:59.302479 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:07:59.302149 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxnw4" event={"ID":"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e","Type":"ContainerDied","Data":"5824bca3909170e4a58a2954115ab9eb08759b57e2b5b3b492b1c5770432a1c2"} Apr 24 19:08:00.306390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:00.306350 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxnw4" event={"ID":"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e","Type":"ContainerStarted","Data":"fff651a53aec4104d4d391d5c689d7195389d135f7baf24359aa5ac8bfec1d03"} Apr 24 19:08:00.306390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:00.306391 2565 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-nxnw4" event={"ID":"d9dc8854-e8d8-4f1d-a016-3d7507cdf28e","Type":"ContainerStarted","Data":"9c41d6c5ebfb08c587dd48b3a15b96c24d5828594e6adf592150cd69c212ee8b"} Apr 24 19:08:00.336849 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:00.336795 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-nxnw4" podStartSLOduration=2.617550938 podStartE2EDuration="3.336780659s" podCreationTimestamp="2026-04-24 19:07:57 +0000 UTC" firstStartedPulling="2026-04-24 19:07:57.656850845 +0000 UTC m=+80.294858830" lastFinishedPulling="2026-04-24 19:07:58.376080555 +0000 UTC m=+81.014088551" observedRunningTime="2026-04-24 19:08:00.33483669 +0000 UTC m=+82.972844691" watchObservedRunningTime="2026-04-24 19:08:00.336780659 +0000 UTC m=+82.974788659" Apr 24 19:08:01.989244 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:01.989202 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:01.989244 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:01.989245 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:01.994281 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:01.994254 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:02.315641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:02.315554 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:19.069099 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:19.069063 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"] Apr 24 19:08:44.091017 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.090944 2565 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c9fc5b9c5-mm752" podUID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" containerName="console" containerID="cri-o://167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2" gracePeriod=15 Apr 24 19:08:44.340431 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.340409 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c9fc5b9c5-mm752_974d0d2e-a0ad-4b68-aff3-35d352d38ea2/console/0.log" Apr 24 19:08:44.340549 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.340482 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:44.419614 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.419538 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c9fc5b9c5-mm752_974d0d2e-a0ad-4b68-aff3-35d352d38ea2/console/0.log" Apr 24 19:08:44.419614 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.419580 2565 generic.go:358] "Generic (PLEG): container finished" podID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" containerID="167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2" exitCode=2 Apr 24 19:08:44.419802 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.419611 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9fc5b9c5-mm752" event={"ID":"974d0d2e-a0ad-4b68-aff3-35d352d38ea2","Type":"ContainerDied","Data":"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2"} Apr 24 19:08:44.419802 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.419651 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c9fc5b9c5-mm752" event={"ID":"974d0d2e-a0ad-4b68-aff3-35d352d38ea2","Type":"ContainerDied","Data":"42438d38bb823ca370a88bc0438a3edc6fb296b7e68e7b2d4a56972597460664"} Apr 24 19:08:44.419802 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:08:44.419651 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c9fc5b9c5-mm752" Apr 24 19:08:44.419802 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.419662 2565 scope.go:117] "RemoveContainer" containerID="167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2" Apr 24 19:08:44.427388 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.427367 2565 scope.go:117] "RemoveContainer" containerID="167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2" Apr 24 19:08:44.427630 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:08:44.427612 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2\": container with ID starting with 167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2 not found: ID does not exist" containerID="167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2" Apr 24 19:08:44.427671 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.427639 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2"} err="failed to get container status \"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2\": rpc error: code = NotFound desc = could not find container \"167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2\": container with ID starting with 167b6cd9930538189f3f40ae204abad5178da13d33f0d2add5a6dc066eae61d2 not found: ID does not exist" Apr 24 19:08:44.477445 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477410 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: 
\"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477445 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477450 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477625 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477493 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477625 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477517 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477625 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477549 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9chs6\" (UniqueName: \"kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477625 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477581 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config\") pod \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\" (UID: \"974d0d2e-a0ad-4b68-aff3-35d352d38ea2\") " Apr 24 19:08:44.477992 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:08:44.477948 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:08:44.478089 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477957 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca" (OuterVolumeSpecName: "service-ca") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:08:44.478089 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.477966 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config" (OuterVolumeSpecName: "console-config") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:08:44.479831 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.479802 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:08:44.479924 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.479845 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6" (OuterVolumeSpecName: "kube-api-access-9chs6") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "kube-api-access-9chs6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:08:44.479924 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.479843 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "974d0d2e-a0ad-4b68-aff3-35d352d38ea2" (UID: "974d0d2e-a0ad-4b68-aff3-35d352d38ea2"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:08:44.578369 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578321 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9chs6\" (UniqueName: \"kubernetes.io/projected/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-kube-api-access-9chs6\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.578369 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578364 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-oauth-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.578369 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578374 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-oauth-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.578369 
ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578385 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-config\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.578632 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578395 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-console-serving-cert\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.578632 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.578404 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/974d0d2e-a0ad-4b68-aff3-35d352d38ea2-service-ca\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:08:44.740444 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.740416 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"] Apr 24 19:08:44.744390 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:44.744360 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c9fc5b9c5-mm752"] Apr 24 19:08:45.969175 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:08:45.969131 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" path="/var/lib/kubelet/pods/974d0d2e-a0ad-4b68-aff3-35d352d38ea2/volumes" Apr 24 19:11:21.562429 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.562395 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6"] Apr 24 19:11:21.562872 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.562619 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" 
containerName="console" Apr 24 19:11:21.562872 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.562630 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" containerName="console" Apr 24 19:11:21.562872 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.562669 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="974d0d2e-a0ad-4b68-aff3-35d352d38ea2" containerName="console" Apr 24 19:11:21.565393 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.565374 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.568065 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.568042 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-kq9p6\"" Apr 24 19:11:21.568165 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.568055 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 24 19:11:21.569213 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.569195 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 24 19:11:21.574273 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.574249 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6"] Apr 24 19:11:21.641933 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.641899 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.642127 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.641951 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s8z4\" (UniqueName: \"kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.642127 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.642049 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.742453 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.742401 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.742453 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.742466 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5s8z4\" (UniqueName: \"kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.742634 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.742499 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.742796 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.742775 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.742839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.742802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.751989 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.751941 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s8z4\" (UniqueName: \"kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 
19:11:21.874839 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.874751 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:21.994022 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:21.993998 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6"] Apr 24 19:11:21.996516 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:11:21.996489 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cf568e_9113_4f4e_9c60_cfcce6cddf9a.slice/crio-52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247 WatchSource:0}: Error finding container 52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247: Status 404 returned error can't find the container with id 52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247 Apr 24 19:11:22.816292 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:22.816246 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" event={"ID":"86cf568e-9113-4f4e-9c60-cfcce6cddf9a","Type":"ContainerStarted","Data":"52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247"} Apr 24 19:11:28.833808 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:28.833770 2565 generic.go:358] "Generic (PLEG): container finished" podID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerID="0de6bb7c2fbd4ed7276aeeecb2f7fdb75ab67f681b50961d504211a2d783765b" exitCode=0 Apr 24 19:11:28.834203 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:28.833858 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" 
event={"ID":"86cf568e-9113-4f4e-9c60-cfcce6cddf9a","Type":"ContainerDied","Data":"0de6bb7c2fbd4ed7276aeeecb2f7fdb75ab67f681b50961d504211a2d783765b"} Apr 24 19:11:30.842110 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:30.842017 2565 generic.go:358] "Generic (PLEG): container finished" podID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerID="9c5d1f00d2701291b7f37195153d36292a4866c21483694ead68c890727ed8e0" exitCode=0 Apr 24 19:11:30.842110 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:30.842066 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" event={"ID":"86cf568e-9113-4f4e-9c60-cfcce6cddf9a","Type":"ContainerDied","Data":"9c5d1f00d2701291b7f37195153d36292a4866c21483694ead68c890727ed8e0"} Apr 24 19:11:37.859177 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:37.859151 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:11:37.859581 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:37.859300 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:11:37.862111 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:37.862094 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 19:11:37.862295 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:37.862272 2565 generic.go:358] "Generic (PLEG): container finished" podID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerID="62c35c10ca9c7a58f797be4b3b99d67fbffa0c3b8a41473c77e06f67bd333004" exitCode=0 Apr 24 19:11:37.862349 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:37.862314 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" 
event={"ID":"86cf568e-9113-4f4e-9c60-cfcce6cddf9a","Type":"ContainerDied","Data":"62c35c10ca9c7a58f797be4b3b99d67fbffa0c3b8a41473c77e06f67bd333004"} Apr 24 19:11:38.980676 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:38.980655 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:39.074742 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.074695 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle\") pod \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " Apr 24 19:11:39.074742 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.074760 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util\") pod \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " Apr 24 19:11:39.075031 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.074786 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s8z4\" (UniqueName: \"kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4\") pod \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\" (UID: \"86cf568e-9113-4f4e-9c60-cfcce6cddf9a\") " Apr 24 19:11:39.075294 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.075270 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle" (OuterVolumeSpecName: "bundle") pod "86cf568e-9113-4f4e-9c60-cfcce6cddf9a" (UID: "86cf568e-9113-4f4e-9c60-cfcce6cddf9a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:39.077012 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.076961 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4" (OuterVolumeSpecName: "kube-api-access-5s8z4") pod "86cf568e-9113-4f4e-9c60-cfcce6cddf9a" (UID: "86cf568e-9113-4f4e-9c60-cfcce6cddf9a"). InnerVolumeSpecName "kube-api-access-5s8z4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:11:39.079000 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.078963 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util" (OuterVolumeSpecName: "util") pod "86cf568e-9113-4f4e-9c60-cfcce6cddf9a" (UID: "86cf568e-9113-4f4e-9c60-cfcce6cddf9a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:11:39.176079 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.175964 2565 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-bundle\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:11:39.176079 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.176024 2565 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-util\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:11:39.176079 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.176036 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5s8z4\" (UniqueName: \"kubernetes.io/projected/86cf568e-9113-4f4e-9c60-cfcce6cddf9a-kube-api-access-5s8z4\") on node \"ip-10-0-138-142.ec2.internal\" DevicePath \"\"" Apr 24 19:11:39.869416 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.869393 2565 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" Apr 24 19:11:39.869587 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.869394 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29c7zqp6" event={"ID":"86cf568e-9113-4f4e-9c60-cfcce6cddf9a","Type":"ContainerDied","Data":"52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247"} Apr 24 19:11:39.869587 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:39.869491 2565 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52bedaba04798b7332ddc8eecd07551c9094d0b498fb426d7ad4593dae5e2247" Apr 24 19:11:43.183677 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183637 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"] Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183876 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="util" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183887 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="util" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183896 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="extract" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183901 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="extract" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183909 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="pull" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183915 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="pull" Apr 24 19:11:43.184069 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.183966 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="86cf568e-9113-4f4e-9c60-cfcce6cddf9a" containerName="extract" Apr 24 19:11:43.185648 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.185632 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6" Apr 24 19:11:43.188125 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.188098 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 24 19:11:43.188251 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.188145 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-9dpdq\"" Apr 24 19:11:43.188251 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.188209 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 19:11:43.188360 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.188207 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 19:11:43.197855 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.197663 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"] Apr 24 19:11:43.305405 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.305364 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5hfg\" (UniqueName: 
\"kubernetes.io/projected/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-kube-api-access-n5hfg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.305405 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.305405 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.405730 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.405700 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5hfg\" (UniqueName: \"kubernetes.io/projected/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-kube-api-access-n5hfg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.405842 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.405738 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.408041 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.408017 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.414249 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.414230 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5hfg\" (UniqueName: \"kubernetes.io/projected/e8b75e4b-fb44-4abe-a8b6-f290449d49ec-kube-api-access-n5hfg\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6\" (UID: \"e8b75e4b-fb44-4abe-a8b6-f290449d49ec\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.495640 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.495550 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:43.621713 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.621687 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"]
Apr 24 19:11:43.624273 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:11:43.624234 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8b75e4b_fb44_4abe_a8b6_f290449d49ec.slice/crio-c2ac0fec458fb248ee21a5b02edce038ad18f2a06fc5fc6fe9d51d848f039b1c WatchSource:0}: Error finding container c2ac0fec458fb248ee21a5b02edce038ad18f2a06fc5fc6fe9d51d848f039b1c: Status 404 returned error can't find the container with id c2ac0fec458fb248ee21a5b02edce038ad18f2a06fc5fc6fe9d51d848f039b1c
Apr 24 19:11:43.625865 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.625849 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:11:43.882854 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:43.882767 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6" event={"ID":"e8b75e4b-fb44-4abe-a8b6-f290449d49ec","Type":"ContainerStarted","Data":"c2ac0fec458fb248ee21a5b02edce038ad18f2a06fc5fc6fe9d51d848f039b1c"}
Apr 24 19:11:47.670358 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.670278 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7crwv"]
Apr 24 19:11:47.672732 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.672714 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.675165 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.675134 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 24 19:11:47.675165 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.675159 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 24 19:11:47.675350 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.675229 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-2p5z7\""
Apr 24 19:11:47.688461 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.688437 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7crwv"]
Apr 24 19:11:47.840916 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.840881 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.840916 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.840914 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtj4\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-kube-api-access-4wtj4\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.841131 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.840939 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-cabundle0\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.895387 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.895352 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6" event={"ID":"e8b75e4b-fb44-4abe-a8b6-f290449d49ec","Type":"ContainerStarted","Data":"ba099751a1d7f32adaf53fc9d70ba0c40fb74f71fdf48a92b7e1ef23c13f2cea"}
Apr 24 19:11:47.895556 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.895465 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:11:47.916710 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.916650 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6" podStartSLOduration=1.356222643 podStartE2EDuration="4.916629816s" podCreationTimestamp="2026-04-24 19:11:43 +0000 UTC" firstStartedPulling="2026-04-24 19:11:43.625986416 +0000 UTC m=+306.263994407" lastFinishedPulling="2026-04-24 19:11:47.186393597 +0000 UTC m=+309.824401580" observedRunningTime="2026-04-24 19:11:47.914500372 +0000 UTC m=+310.552508375" watchObservedRunningTime="2026-04-24 19:11:47.916629816 +0000 UTC m=+310.554637818"
Apr 24 19:11:47.942104 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.942076 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.942104 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.942108 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtj4\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-kube-api-access-4wtj4\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.942302 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.942135 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-cabundle0\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.942302 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:47.942235 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 24 19:11:47.942302 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:47.942252 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 19:11:47.942302 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:47.942261 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7crwv: references non-existent secret key: ca.crt
Apr 24 19:11:47.942428 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:47.942316 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates podName:3a3c28d3-cf6a-41a3-832c-5f39aad4d155 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:48.442299129 +0000 UTC m=+311.080307108 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates") pod "keda-operator-ffbb595cb-7crwv" (UID: "3a3c28d3-cf6a-41a3-832c-5f39aad4d155") : references non-existent secret key: ca.crt
Apr 24 19:11:47.942699 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.942683 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-cabundle0\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:47.952731 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:47.952707 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtj4\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-kube-api-access-4wtj4\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:48.047805 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.047775 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"]
Apr 24 19:11:48.052436 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.050638 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.055066 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.055036 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 24 19:11:48.060078 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.060056 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"]
Apr 24 19:11:48.143763 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.143728 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxd6c\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-kube-api-access-rxd6c\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.143938 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.143804 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.143938 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.143827 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.244457 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.244364 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxd6c\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-kube-api-access-rxd6c\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.244457 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.244441 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.244686 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.244462 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.244686 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.244569 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 19:11:48.244686 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.244584 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 19:11:48.244686 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.244601 2565 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 24 19:11:48.244686 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.244622 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 19:11:48.244879 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.244701 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates podName:7fd0509c-8cf9-480c-bad2-2e5b6f4622e0 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:48.744681191 +0000 UTC m=+311.382689180 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates") pod "keda-metrics-apiserver-7c9f485588-ndfjc" (UID: "7fd0509c-8cf9-480c-bad2-2e5b6f4622e0") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 24 19:11:48.244929 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.244900 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.260362 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.260334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxd6c\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-kube-api-access-rxd6c\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.268366 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.268334 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-tl9nq"]
Apr 24 19:11:48.270641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.270625 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.273664 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.273647 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 24 19:11:48.283860 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.283835 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tl9nq"]
Apr 24 19:11:48.446779 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.446740 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.446939 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.446840 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2st4\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-kube-api-access-q2st4\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.446939 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.446875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:48.447096 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.446992 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 24 19:11:48.447096 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.447008 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 19:11:48.447096 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.447021 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7crwv: references non-existent secret key: ca.crt
Apr 24 19:11:48.447096 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.447090 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates podName:3a3c28d3-cf6a-41a3-832c-5f39aad4d155 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:49.447073338 +0000 UTC m=+312.085081325 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates") pod "keda-operator-ffbb595cb-7crwv" (UID: "3a3c28d3-cf6a-41a3-832c-5f39aad4d155") : references non-existent secret key: ca.crt
Apr 24 19:11:48.547312 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.547227 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.547312 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.547291 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2st4\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-kube-api-access-q2st4\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.547536 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.547389 2565 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 24 19:11:48.547536 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.547419 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-tl9nq: secret "keda-admission-webhooks-certs" not found
Apr 24 19:11:48.547536 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.547498 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates podName:7af42bbe-76ed-42af-a2c0-551d6650c2e3 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:49.047476296 +0000 UTC m=+311.685484281 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates") pod "keda-admission-cf49989db-tl9nq" (UID: "7af42bbe-76ed-42af-a2c0-551d6650c2e3") : secret "keda-admission-webhooks-certs" not found
Apr 24 19:11:48.557735 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.557708 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2st4\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-kube-api-access-q2st4\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:48.748740 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:48.748698 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:48.749249 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.748863 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 19:11:48.749249 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.748887 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 19:11:48.749249 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.748912 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc: references non-existent secret key: tls.crt
Apr 24 19:11:48.749249 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:48.748998 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates podName:7fd0509c-8cf9-480c-bad2-2e5b6f4622e0 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:49.748960057 +0000 UTC m=+312.386968036 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates") pod "keda-metrics-apiserver-7c9f485588-ndfjc" (UID: "7fd0509c-8cf9-480c-bad2-2e5b6f4622e0") : references non-existent secret key: tls.crt
Apr 24 19:11:49.051989 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.051945 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:49.054332 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.054303 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7af42bbe-76ed-42af-a2c0-551d6650c2e3-certificates\") pod \"keda-admission-cf49989db-tl9nq\" (UID: \"7af42bbe-76ed-42af-a2c0-551d6650c2e3\") " pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:49.180598 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.180550 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:49.303092 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.303017 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-tl9nq"]
Apr 24 19:11:49.307245 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:11:49.307200 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af42bbe_76ed_42af_a2c0_551d6650c2e3.slice/crio-15afaa4905ed9459540ab776814ec5fecd8224168b05c961ec2fbb76d019270d WatchSource:0}: Error finding container 15afaa4905ed9459540ab776814ec5fecd8224168b05c961ec2fbb76d019270d: Status 404 returned error can't find the container with id 15afaa4905ed9459540ab776814ec5fecd8224168b05c961ec2fbb76d019270d
Apr 24 19:11:49.456117 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.456079 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:49.456307 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.456225 2565 secret.go:281] references non-existent secret key: ca.crt
Apr 24 19:11:49.456307 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.456241 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 24 19:11:49.456307 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.456251 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-7crwv: references non-existent secret key: ca.crt
Apr 24 19:11:49.456307 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.456300 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates podName:3a3c28d3-cf6a-41a3-832c-5f39aad4d155 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:51.4562868 +0000 UTC m=+314.094294782 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates") pod "keda-operator-ffbb595cb-7crwv" (UID: "3a3c28d3-cf6a-41a3-832c-5f39aad4d155") : references non-existent secret key: ca.crt
Apr 24 19:11:49.758639 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.758596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:49.759063 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.758744 2565 secret.go:281] references non-existent secret key: tls.crt
Apr 24 19:11:49.759063 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.758759 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 24 19:11:49.759063 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.758778 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc: references non-existent secret key: tls.crt
Apr 24 19:11:49.759063 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:11:49.758831 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates podName:7fd0509c-8cf9-480c-bad2-2e5b6f4622e0 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:51.758815896 +0000 UTC m=+314.396823890 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates") pod "keda-metrics-apiserver-7c9f485588-ndfjc" (UID: "7fd0509c-8cf9-480c-bad2-2e5b6f4622e0") : references non-existent secret key: tls.crt
Apr 24 19:11:49.902425 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:49.902386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tl9nq" event={"ID":"7af42bbe-76ed-42af-a2c0-551d6650c2e3","Type":"ContainerStarted","Data":"15afaa4905ed9459540ab776814ec5fecd8224168b05c961ec2fbb76d019270d"}
Apr 24 19:11:50.909066 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:50.908960 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-tl9nq" event={"ID":"7af42bbe-76ed-42af-a2c0-551d6650c2e3","Type":"ContainerStarted","Data":"ac00155925d2e8d0d19af6f1b054dcde42ae326ce87c6c83839c2f50961b379d"}
Apr 24 19:11:50.909437 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:50.909122 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:11:50.926056 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:50.925999 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-tl9nq" podStartSLOduration=1.7005311619999999 podStartE2EDuration="2.925966655s" podCreationTimestamp="2026-04-24 19:11:48 +0000 UTC" firstStartedPulling="2026-04-24 19:11:49.308818049 +0000 UTC m=+311.946826029" lastFinishedPulling="2026-04-24 19:11:50.534253543 +0000 UTC m=+313.172261522" observedRunningTime="2026-04-24 19:11:50.923995713 +0000 UTC m=+313.562003710" watchObservedRunningTime="2026-04-24 19:11:50.925966655 +0000 UTC m=+313.563974657"
Apr 24 19:11:51.472171 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.472135 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:51.474532 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.474514 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/3a3c28d3-cf6a-41a3-832c-5f39aad4d155-certificates\") pod \"keda-operator-ffbb595cb-7crwv\" (UID: \"3a3c28d3-cf6a-41a3-832c-5f39aad4d155\") " pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:51.582766 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.582716 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:51.702315 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.702282 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-7crwv"]
Apr 24 19:11:51.705753 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:11:51.705724 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a3c28d3_cf6a_41a3_832c_5f39aad4d155.slice/crio-c527490b913b2c0e9fe68152e71337b45ed89d7b06df3260eba5deef315238cd WatchSource:0}: Error finding container c527490b913b2c0e9fe68152e71337b45ed89d7b06df3260eba5deef315238cd: Status 404 returned error can't find the container with id c527490b913b2c0e9fe68152e71337b45ed89d7b06df3260eba5deef315238cd
Apr 24 19:11:51.775240 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.775159 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:51.777548 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.777521 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/7fd0509c-8cf9-480c-bad2-2e5b6f4622e0-certificates\") pod \"keda-metrics-apiserver-7c9f485588-ndfjc\" (UID: \"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:51.912641 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.912605 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-7crwv" event={"ID":"3a3c28d3-cf6a-41a3-832c-5f39aad4d155","Type":"ContainerStarted","Data":"c527490b913b2c0e9fe68152e71337b45ed89d7b06df3260eba5deef315238cd"}
Apr 24 19:11:51.963092 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:51.963057 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:52.099352 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:52.099321 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"]
Apr 24 19:11:52.101546 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:11:52.101519 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd0509c_8cf9_480c_bad2_2e5b6f4622e0.slice/crio-20940fb45a4b0e25029f82d4b6f3963d868dd65d72fa8387abea0893b4dd6c63 WatchSource:0}: Error finding container 20940fb45a4b0e25029f82d4b6f3963d868dd65d72fa8387abea0893b4dd6c63: Status 404 returned error can't find the container with id 20940fb45a4b0e25029f82d4b6f3963d868dd65d72fa8387abea0893b4dd6c63
Apr 24 19:11:52.917004 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:52.916953 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc" event={"ID":"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0","Type":"ContainerStarted","Data":"20940fb45a4b0e25029f82d4b6f3963d868dd65d72fa8387abea0893b4dd6c63"}
Apr 24 19:11:55.926791 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.926690 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-7crwv" event={"ID":"3a3c28d3-cf6a-41a3-832c-5f39aad4d155","Type":"ContainerStarted","Data":"f5d62483b0e8945b3fcfdb841f323c960fedb6c05cb0e2e035d9f5d0d884cede"}
Apr 24 19:11:55.927273 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.926825 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:11:55.928148 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.928126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc" event={"ID":"7fd0509c-8cf9-480c-bad2-2e5b6f4622e0","Type":"ContainerStarted","Data":"4f7095006e862232521d99ee12714d988079ba974503bdbeff88ec3b481a727d"}
Apr 24 19:11:55.928264 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.928230 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:11:55.943842 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.943783 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-7crwv" podStartSLOduration=5.11067718 podStartE2EDuration="8.94376591s" podCreationTimestamp="2026-04-24 19:11:47 +0000 UTC" firstStartedPulling="2026-04-24 19:11:51.706924459 +0000 UTC m=+314.344932439" lastFinishedPulling="2026-04-24 19:11:55.540013187 +0000 UTC m=+318.178021169" observedRunningTime="2026-04-24 19:11:55.941548928 +0000 UTC m=+318.579556929" watchObservedRunningTime="2026-04-24 19:11:55.94376591 +0000 UTC m=+318.581773913"
Apr 24 19:11:55.956612 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:11:55.956564 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc" podStartSLOduration=4.526022683 podStartE2EDuration="7.956549152s" podCreationTimestamp="2026-04-24 19:11:48 +0000 UTC" firstStartedPulling="2026-04-24 19:11:52.102811878 +0000 UTC m=+314.740819857" lastFinishedPulling="2026-04-24 19:11:55.533338347 +0000 UTC m=+318.171346326" observedRunningTime="2026-04-24 19:11:55.955125009 +0000 UTC m=+318.593133012" watchObservedRunningTime="2026-04-24 19:11:55.956549152 +0000 UTC m=+318.594557153"
Apr 24 19:12:06.936141 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:06.936113 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-ndfjc"
Apr 24 19:12:08.900488 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:08.900451 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-wbvs6"
Apr 24 19:12:11.914849 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:11.914821 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-tl9nq"
Apr 24 19:12:16.933185 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:16.933153 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-7crwv"
Apr 24 19:12:54.607343 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.607310 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-69vs7"]
Apr 24 19:12:54.615461 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.615438 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7"
Apr 24 19:12:54.618309 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.618283 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-jpnwc\""
Apr 24 19:12:54.618459 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.618311 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 24 19:12:54.618459 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.618326 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 19:12:54.618459 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.618326 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 19:12:54.621441 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.621413 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-69vs7"]
Apr 24 19:12:54.746036
ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.745998 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nltz\" (UniqueName: \"kubernetes.io/projected/cbb0dc30-045f-4d72-a612-7aec91a13e0d-kube-api-access-6nltz\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:54.746235 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.746049 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:54.847280 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.847249 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6nltz\" (UniqueName: \"kubernetes.io/projected/cbb0dc30-045f-4d72-a612-7aec91a13e0d-kube-api-access-6nltz\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:54.847461 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.847302 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:54.847461 ip-10-0-138-142 kubenswrapper[2565]: E0424 19:12:54.847431 2565 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 24 19:12:54.847569 ip-10-0-138-142 
kubenswrapper[2565]: E0424 19:12:54.847504 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert podName:cbb0dc30-045f-4d72-a612-7aec91a13e0d nodeName:}" failed. No retries permitted until 2026-04-24 19:12:55.347480682 +0000 UTC m=+377.985488663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert") pod "llmisvc-controller-manager-68cc5db7c4-69vs7" (UID: "cbb0dc30-045f-4d72-a612-7aec91a13e0d") : secret "llmisvc-webhook-server-cert" not found Apr 24 19:12:54.862397 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:54.862331 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nltz\" (UniqueName: \"kubernetes.io/projected/cbb0dc30-045f-4d72-a612-7aec91a13e0d-kube-api-access-6nltz\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:55.350736 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:55.350701 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:55.353161 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:55.353126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cbb0dc30-045f-4d72-a612-7aec91a13e0d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-69vs7\" (UID: \"cbb0dc30-045f-4d72-a612-7aec91a13e0d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:55.525934 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:55.525886 2565 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:55.643303 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:55.643257 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-69vs7"] Apr 24 19:12:55.645897 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:12:55.645868 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcbb0dc30_045f_4d72_a612_7aec91a13e0d.slice/crio-c69e0d0b9502b440611e4e71d9192b5c0c60f7913655c25348a530f9a7186f85 WatchSource:0}: Error finding container c69e0d0b9502b440611e4e71d9192b5c0c60f7913655c25348a530f9a7186f85: Status 404 returned error can't find the container with id c69e0d0b9502b440611e4e71d9192b5c0c60f7913655c25348a530f9a7186f85 Apr 24 19:12:56.092195 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:56.092154 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" event={"ID":"cbb0dc30-045f-4d72-a612-7aec91a13e0d","Type":"ContainerStarted","Data":"c69e0d0b9502b440611e4e71d9192b5c0c60f7913655c25348a530f9a7186f85"} Apr 24 19:12:58.098108 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:58.098070 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" event={"ID":"cbb0dc30-045f-4d72-a612-7aec91a13e0d","Type":"ContainerStarted","Data":"b6fc9b1fe7f3f883ceb871d8b9db66f504efba80c9da8c083b44620a320f636e"} Apr 24 19:12:58.098563 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:58.098182 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:12:58.116103 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:12:58.116056 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" podStartSLOduration=2.218411111 
podStartE2EDuration="4.116041035s" podCreationTimestamp="2026-04-24 19:12:54 +0000 UTC" firstStartedPulling="2026-04-24 19:12:55.647217669 +0000 UTC m=+378.285225647" lastFinishedPulling="2026-04-24 19:12:57.544847579 +0000 UTC m=+380.182855571" observedRunningTime="2026-04-24 19:12:58.115003391 +0000 UTC m=+380.753011392" watchObservedRunningTime="2026-04-24 19:12:58.116041035 +0000 UTC m=+380.754049036" Apr 24 19:13:29.103395 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:13:29.103366 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-69vs7" Apr 24 19:16:37.877884 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:16:37.877843 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:16:37.879526 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:16:37.879505 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:21:37.899027 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:21:37.898997 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:21:37.900254 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:21:37.900236 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:26:37.916485 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:26:37.916454 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:26:37.919780 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:26:37.919759 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:29:00.974196 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:00.974114 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-qh6dj_71bb1727-5459-4ea6-ade3-0cf8e2a63038/global-pull-secret-syncer/0.log" Apr 24 19:29:01.054229 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:01.054198 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-8wdng_f92f49a6-73c3-429f-83d7-d170a250611b/konnectivity-agent/0.log" Apr 24 19:29:01.111893 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:01.111858 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-142.ec2.internal_027943e939a2d76cdb600f777d89968b/haproxy/0.log" Apr 24 19:29:05.178294 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:05.178258 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxnw4_d9dc8854-e8d8-4f1d-a016-3d7507cdf28e/node-exporter/0.log" Apr 24 19:29:05.194417 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:05.194394 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxnw4_d9dc8854-e8d8-4f1d-a016-3d7507cdf28e/kube-rbac-proxy/0.log" Apr 24 19:29:05.213433 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:05.213389 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-nxnw4_d9dc8854-e8d8-4f1d-a016-3d7507cdf28e/init-textfile/0.log" Apr 24 19:29:05.446470 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:05.446384 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rhjv8_806d34d7-6f49-489a-8b8f-c6ae79b29adb/prometheus-operator/0.log" Apr 24 19:29:05.460290 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:05.460263 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-rhjv8_806d34d7-6f49-489a-8b8f-c6ae79b29adb/kube-rbac-proxy/0.log" Apr 24 19:29:08.233319 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.233278 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l"] Apr 24 19:29:08.236369 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.236348 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.238845 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.238823 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"kube-root-ca.crt\"" Apr 24 19:29:08.239798 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.239782 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jzlrb\"/\"openshift-service-ca.crt\"" Apr 24 19:29:08.239896 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.239812 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jzlrb\"/\"default-dockercfg-df8sl\"" Apr 24 19:29:08.244805 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.244778 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l"] Apr 24 19:29:08.373559 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.373524 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-podres\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.373559 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.373567 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfnfh\" (UniqueName: \"kubernetes.io/projected/fc8218de-7779-47b8-bfb8-1c44936eb908-kube-api-access-xfnfh\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.373769 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.373585 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-lib-modules\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.373769 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.373615 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-sys\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.373769 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.373644 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-proc\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474024 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.473964 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-sys\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: 
\"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474042 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-proc\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474078 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-podres\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474109 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xfnfh\" (UniqueName: \"kubernetes.io/projected/fc8218de-7779-47b8-bfb8-1c44936eb908-kube-api-access-xfnfh\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474126 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-sys\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474134 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: 
\"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-proc\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474202 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474136 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-lib-modules\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474409 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474228 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-podres\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.474409 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.474345 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fc8218de-7779-47b8-bfb8-1c44936eb908-lib-modules\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.483156 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.483127 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfnfh\" (UniqueName: \"kubernetes.io/projected/fc8218de-7779-47b8-bfb8-1c44936eb908-kube-api-access-xfnfh\") pod \"perf-node-gather-daemonset-c4p5l\" (UID: \"fc8218de-7779-47b8-bfb8-1c44936eb908\") " pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.547379 ip-10-0-138-142 kubenswrapper[2565]: I0424 
19:29:08.547296 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:08.668093 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.668058 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l"] Apr 24 19:29:08.671606 ip-10-0-138-142 kubenswrapper[2565]: W0424 19:29:08.671570 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podfc8218de_7779_47b8_bfb8_1c44936eb908.slice/crio-c2b4b37c65a093b8b9b6f5c8b101f76edf97e81629a3c589d3b7d7340f6f5bb6 WatchSource:0}: Error finding container c2b4b37c65a093b8b9b6f5c8b101f76edf97e81629a3c589d3b7d7340f6f5bb6: Status 404 returned error can't find the container with id c2b4b37c65a093b8b9b6f5c8b101f76edf97e81629a3c589d3b7d7340f6f5bb6 Apr 24 19:29:08.673149 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.673132 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:29:08.730535 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.730500 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" event={"ID":"fc8218de-7779-47b8-bfb8-1c44936eb908","Type":"ContainerStarted","Data":"c2b4b37c65a093b8b9b6f5c8b101f76edf97e81629a3c589d3b7d7340f6f5bb6"} Apr 24 19:29:08.809237 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.809215 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kn269_d274396b-9b0c-4277-a460-e66f42e9cf03/dns/0.log" Apr 24 19:29:08.831554 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.831527 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kn269_d274396b-9b0c-4277-a460-e66f42e9cf03/kube-rbac-proxy/0.log" Apr 24 19:29:08.981506 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:08.981472 2565 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-dns_node-resolver-ltbx5_1aec280c-4ff6-4d47-a34b-c89512f4e5f0/dns-node-resolver/0.log" Apr 24 19:29:09.496092 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:09.496064 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7kd45_281b0be6-1dcd-4e12-a8a1-7617ae9111e8/node-ca/0.log" Apr 24 19:29:09.734137 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:09.734104 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" event={"ID":"fc8218de-7779-47b8-bfb8-1c44936eb908","Type":"ContainerStarted","Data":"d8f7f0cef806b30f0de4c69292df27942d21d4b2e9810eab019ceb50b37cf2bc"} Apr 24 19:29:09.734314 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:09.734268 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:09.750875 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:09.750795 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" podStartSLOduration=1.750782138 podStartE2EDuration="1.750782138s" podCreationTimestamp="2026-04-24 19:29:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:29:09.75038421 +0000 UTC m=+1352.388392210" watchObservedRunningTime="2026-04-24 19:29:09.750782138 +0000 UTC m=+1352.388790136" Apr 24 19:29:10.586782 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:10.586726 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-77k7z_cfdbf694-67b0-45a2-848c-4bd1f8440a3d/serve-healthcheck-canary/0.log" Apr 24 19:29:10.981041 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:10.981011 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-6f8xt_cd7f4efb-5956-404b-8513-49f0a6384dd1/kube-rbac-proxy/0.log" Apr 24 19:29:11.001104 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:11.001073 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6f8xt_cd7f4efb-5956-404b-8513-49f0a6384dd1/exporter/0.log" Apr 24 19:29:11.022437 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:11.022407 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6f8xt_cd7f4efb-5956-404b-8513-49f0a6384dd1/extractor/0.log" Apr 24 19:29:13.128913 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:13.128882 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-69vs7_cbb0dc30-045f-4d72-a612-7aec91a13e0d/manager/0.log" Apr 24 19:29:15.746408 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:15.746380 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jzlrb/perf-node-gather-daemonset-c4p5l" Apr 24 19:29:18.358207 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.358181 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/kube-multus-additional-cni-plugins/0.log" Apr 24 19:29:18.375618 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.375583 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/egress-router-binary-copy/0.log" Apr 24 19:29:18.391984 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.391935 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/cni-plugins/0.log" Apr 24 19:29:18.409022 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.408995 2565 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/bond-cni-plugin/0.log" Apr 24 19:29:18.425435 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.425399 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/routeoverride-cni/0.log" Apr 24 19:29:18.441670 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.441635 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/whereabouts-cni-bincopy/0.log" Apr 24 19:29:18.457016 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.456994 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6zxkd_5998df72-185d-4c96-bec0-af611d2d2173/whereabouts-cni/0.log" Apr 24 19:29:18.824892 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.824858 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z9xw6_d318f255-7c0b-4460-b175-7e79533a85e2/kube-multus/0.log" Apr 24 19:29:18.842184 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.842154 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dxx98_5bd705be-109f-4e28-aaaf-136053b1e384/network-metrics-daemon/0.log" Apr 24 19:29:18.859436 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:18.859411 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-dxx98_5bd705be-109f-4e28-aaaf-136053b1e384/kube-rbac-proxy/0.log" Apr 24 19:29:20.034511 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.034476 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-controller/0.log" Apr 24 19:29:20.049393 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:29:20.049365 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/0.log" Apr 24 19:29:20.060645 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.060617 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovn-acl-logging/1.log" Apr 24 19:29:20.078231 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.078210 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/kube-rbac-proxy-node/0.log" Apr 24 19:29:20.097611 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.097567 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 19:29:20.113857 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.113829 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/northd/0.log" Apr 24 19:29:20.133690 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.133665 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/nbdb/0.log" Apr 24 19:29:20.150893 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.150871 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/sbdb/0.log" Apr 24 19:29:20.303665 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:20.303587 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5frl6_ac65a5f6-8127-4bb6-a3be-c5ed95f56ae6/ovnkube-controller/0.log" Apr 24 19:29:21.495149 ip-10-0-138-142 
kubenswrapper[2565]: I0424 19:29:21.495124 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-7kd4p_127bc72d-f6ec-4cad-8e85-c35350f4ded7/network-check-target-container/0.log" Apr 24 19:29:22.359296 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:22.355208 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-f2bnb_917aae83-6ba1-4789-900b-362379b7e091/iptables-alerter/0.log" Apr 24 19:29:22.978583 ip-10-0-138-142 kubenswrapper[2565]: I0424 19:29:22.978539 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-4k6xp_82b67635-7eb8-44b6-9ea2-8d4e7edf454a/tuned/0.log"