Apr 24 16:36:32.129813 ip-10-0-137-83 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 16:36:32.129827 ip-10-0-137-83 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 16:36:32.129837 ip-10-0-137-83 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 16:36:32.130141 ip-10-0-137-83 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 16:36:42.246145 ip-10-0-137-83 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 16:36:42.246165 ip-10-0-137-83 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 98fcc827967b4982bc30c8c6dfa94917 --
Apr 24 16:39:06.908433 ip-10-0-137-83 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 16:39:07.277301 ip-10-0-137-83 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.277301 ip-10-0-137-83 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 16:39:07.277301 ip-10-0-137-83 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.277301 ip-10-0-137-83 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 16:39:07.277301 ip-10-0-137-83 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 16:39:07.279425 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.279334 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 16:39:07.285269 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285245 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:07.285269 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285265 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:07.285269 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285270 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:07.285269 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285275 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:07.285269 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285279 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285282 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285285 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285288 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285290 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285293 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285296 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285299 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285302 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285304 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285307 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285309 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285311 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285314 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285316 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285318 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285321 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285324 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285326 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285329 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:07.285467 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285331 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285334 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285346 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285349 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285352 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285360 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285362 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285365 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285367 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285370 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285372 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285375 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285377 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285380 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285384 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285386 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285390 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285392 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285395 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285398 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:07.285955 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285400 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285403 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285406 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285408 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285410 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285413 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285416 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285418 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285421 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285423 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285426 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285428 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285431 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285433 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285436 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285438 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285441 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285443 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285447 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:07.286453 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285451 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285454 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285457 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285459 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285462 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285465 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285468 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285471 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285474 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285477 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285480 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285483 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285487 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285490 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285492 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285501 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285504 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285507 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285509 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285512 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:07.286943 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285514 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285517 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.285519 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286673 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286681 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286684 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286687 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286689 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286692 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286697 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286700 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286704 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286706 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286709 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286712 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286714 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286717 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286719 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286722 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286725 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 16:39:07.287473 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286728 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286730 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286733 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286735 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286738 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286741 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286743 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286748 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286750 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286753 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286755 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286758 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286761 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286763 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286766 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286768 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286771 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286774 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286776 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286779 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 16:39:07.287950 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286781 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286784 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286786 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286789 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286791 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286795 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286797 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286800 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286802 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286805 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286807 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286811 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286813 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286816 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286819 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286821 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286824 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286827 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286829 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286832 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 16:39:07.288494 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286835 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286837 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286840 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286842 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286844 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286847 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286849 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286853 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286856 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286858 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286861 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286863 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286866 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286868 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286872 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286875 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286878 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286881 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286883 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 16:39:07.288982 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286886 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286888 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286891 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286893 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286896 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286899 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286901 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286904 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286906 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.286909 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.286979 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.286986 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.286993 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.286998 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287004 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287007 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287011 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287016 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287019 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287022 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287026 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287030 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 16:39:07.289470 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287033 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287036 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287038 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287041 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287044 2575 flags.go:64] FLAG: --cloud-config=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287047 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287050 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287054 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287057 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287059 2575 flags.go:64] FLAG: --config-dir=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287062 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287066 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287070 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287073 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287076 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287079 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287082 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287085 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287087 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287090 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287093 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287097 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287118 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287121 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287124 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 16:39:07.290004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287128 2575 flags.go:64] FLAG: --enable-server="true"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287131 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287136 2575 flags.go:64] FLAG: --event-burst="100"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287139 2575 flags.go:64] FLAG: --event-qps="50"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287142 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287145 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287156 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287161 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287164 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287167 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287170 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287173 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287176 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287179 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287181 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287184 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287187 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 16:39:07.290616
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287190 2575 flags.go:64] FLAG: --feature-gates="" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287194 2575 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287197 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287199 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287203 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287206 2575 flags.go:64] FLAG: --healthz-port="10248" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287208 2575 flags.go:64] FLAG: --help="false" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287211 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.290616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287214 2575 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287217 2575 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287220 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287223 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287226 2575 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287229 2575 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287232 2575 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287235 2575 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287238 2575 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287241 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287244 2575 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287247 2575 flags.go:64] FLAG: --kube-reserved="" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287250 2575 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287252 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287255 2575 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287258 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287261 2575 flags.go:64] FLAG: --lock-file="" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287264 2575 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287267 2575 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287270 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:39:07.287275 2575 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287277 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287280 2575 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 16:39:07.291249 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287283 2575 flags.go:64] FLAG: --logging-format="text" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287286 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287289 2575 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287292 2575 flags.go:64] FLAG: --manifest-url="" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287294 2575 flags.go:64] FLAG: --manifest-url-header="" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287299 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287302 2575 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287309 2575 flags.go:64] FLAG: --max-pods="110" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287311 2575 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287314 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287317 2575 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287320 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 
16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287323 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287326 2575 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287329 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287337 2575 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287340 2575 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287342 2575 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287346 2575 flags.go:64] FLAG: --pod-cidr="" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287349 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287355 2575 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287358 2575 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287361 2575 flags.go:64] FLAG: --pods-per-core="0" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287364 2575 flags.go:64] FLAG: --port="10250" Apr 24 16:39:07.291839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287366 2575 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287370 2575 flags.go:64] FLAG: 
--provider-id="aws:///us-east-1a/i-0657788c5e80c048c" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287373 2575 flags.go:64] FLAG: --qos-reserved="" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287376 2575 flags.go:64] FLAG: --read-only-port="10255" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287379 2575 flags.go:64] FLAG: --register-node="true" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287382 2575 flags.go:64] FLAG: --register-schedulable="true" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287385 2575 flags.go:64] FLAG: --register-with-taints="" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287388 2575 flags.go:64] FLAG: --registry-burst="10" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287391 2575 flags.go:64] FLAG: --registry-qps="5" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287394 2575 flags.go:64] FLAG: --reserved-cpus="" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287396 2575 flags.go:64] FLAG: --reserved-memory="" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287400 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287403 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287405 2575 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287408 2575 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287411 2575 flags.go:64] FLAG: --runonce="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287413 2575 flags.go:64] FLAG: 
--runtime-cgroups="/system.slice/crio.service" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287416 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287419 2575 flags.go:64] FLAG: --seccomp-default="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287422 2575 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287427 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287430 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287433 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287436 2575 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287438 2575 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287441 2575 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 16:39:07.292443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287444 2575 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287447 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287450 2575 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287453 2575 flags.go:64] FLAG: --system-cgroups="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287456 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 
16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287461 2575 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287464 2575 flags.go:64] FLAG: --tls-cert-file="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287467 2575 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287471 2575 flags.go:64] FLAG: --tls-min-version="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287474 2575 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287476 2575 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287479 2575 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287482 2575 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287485 2575 flags.go:64] FLAG: --v="2" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287490 2575 flags.go:64] FLAG: --version="false" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287494 2575 flags.go:64] FLAG: --vmodule="" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287498 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.287501 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287598 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287603 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287607 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287610 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287613 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.293058 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287615 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287618 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287620 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287625 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287627 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287629 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287632 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287634 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 
16:39:07.287637 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287640 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287642 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287645 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287648 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287650 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287653 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287655 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287658 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287661 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287663 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287666 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.293659 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287668 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: 
W0424 16:39:07.287671 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287673 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287676 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287679 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287681 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287683 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287686 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287688 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287691 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287693 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287696 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287698 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287701 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.294210 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287703 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287707 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287709 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287712 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287714 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287717 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.294210 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287719 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287721 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287724 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287726 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287729 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287732 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287734 2575 feature_gate.go:328] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287737 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287739 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287742 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287744 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287747 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287749 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287751 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287754 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287756 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287759 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287761 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287764 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: 
W0424 16:39:07.287766 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.294690 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287768 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287771 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287773 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287776 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287778 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287780 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287783 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287787 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287789 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287791 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287794 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287796 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.295226 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287798 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287801 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287803 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287805 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287809 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287812 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287815 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287817 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:07.295226 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.287823 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.295708 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.288433 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true 
VolumeAttributesClass:false]} Apr 24 16:39:07.295921 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.295903 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 24 16:39:07.295953 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.295921 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 24 16:39:07.295983 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295972 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:07.295983 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295978 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.295983 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295981 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.295983 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295984 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295988 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295991 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295994 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295997 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.295999 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296003 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296005 2575 
feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296008 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296010 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296013 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296015 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296018 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296021 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296023 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296026 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296028 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296031 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296034 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.296082 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296036 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:07.296627 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296039 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296041 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296044 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296048 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296052 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296056 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296058 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296061 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296063 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296066 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296068 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296070 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296073 2575 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296075 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296078 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296081 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296084 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296087 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296089 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.296627 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296092 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296095 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296099 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296120 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296125 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296129 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296134 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296138 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296141 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296143 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296146 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296148 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296151 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296153 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296156 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296158 2575 
feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296163 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296165 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296168 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.297159 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296170 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296173 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296175 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296177 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296180 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296182 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296185 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296187 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296191 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.297617 ip-10-0-137-83 
kubenswrapper[2575]: W0424 16:39:07.296193 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296196 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296198 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296200 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296203 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296205 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296208 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296210 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296213 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296215 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296217 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.297617 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296220 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296222 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.298097 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296225 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296227 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296229 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.296235 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296340 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296345 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296350 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296353 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296356 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296359 2575 feature_gate.go:328] unrecognized feature gate: 
NetworkLiveMigration Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296361 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296364 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296366 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 16:39:07.298097 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296368 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296371 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296374 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296376 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296379 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296381 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296384 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296386 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296389 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296391 2575 
feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296394 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296396 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296398 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296401 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296403 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296405 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296408 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296410 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296413 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296415 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 16:39:07.298475 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296417 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296420 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 16:39:07.298944 ip-10-0-137-83 
kubenswrapper[2575]: W0424 16:39:07.296422 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296425 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296427 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296430 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296432 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296435 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296438 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296440 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296443 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296445 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296448 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296450 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296453 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 16:39:07.298944 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296455 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296457 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296460 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296462 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296464 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 16:39:07.298944 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296467 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296469 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296472 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296474 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296477 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296479 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296482 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296484 2575 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstallIBMCloud Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296486 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296489 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296491 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296494 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296496 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296498 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296501 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296504 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296506 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296508 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296511 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 16:39:07.299436 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296513 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 
16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296516 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296519 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296522 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296525 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296528 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296531 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296534 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296538 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296541 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296543 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296546 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296548 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296550 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296553 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296555 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296557 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 16:39:07.299894 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:07.296560 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 16:39:07.300314 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.296565 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} 
Apr 24 16:39:07.300314 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.297315 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 16:39:07.300314 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.300118 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 16:39:07.300927 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.300916 2575 server.go:1019] "Starting client certificate rotation"
Apr 24 16:39:07.301030 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.301013 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:07.301065 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.301055 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 16:39:07.322778 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.322760 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:07.325148 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.325125 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 16:39:07.335946 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.335922 2575 log.go:25] "Validated CRI v1 runtime API"
Apr 24 16:39:07.341063 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.341048 2575 log.go:25] "Validated CRI v1 image API"
Apr 24 16:39:07.342438 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.342424 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 16:39:07.346088 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.346068 2575 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9b57dd99-6bbe-4870-b6a4-892f3a5b0a3c:/dev/nvme0n1p4 c2fdc972-96ef-4b27-ab2b-8e437398f84f:/dev/nvme0n1p3]
Apr 24 16:39:07.346172 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.346088 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 16:39:07.351434 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.351330 2575 manager.go:217] Machine: {Timestamp:2026-04-24 16:39:07.349642467 +0000 UTC m=+0.340331275 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3033371 MemoryCapacity:32812163072 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28f352b14c7392bc8d00e981dc7a39 SystemUUID:ec28f352-b14c-7392-bc8d-00e981dc7a39 BootID:98fcc827-967b-4982-bc30-c8c6dfa94917 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406081536 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:23:1c:6e:88:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:23:1c:6e:88:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:72:3e:04:57:8a:cf Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812163072 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 16:39:07.351434 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.351424 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 16:39:07.351557 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.351525 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 16:39:07.351983 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.351967 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:07.352439 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.352415 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 16:39:07.352572 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.352440 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-83.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Perc
entage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 16:39:07.352622 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.352598 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 16:39:07.352622 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.352607 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 16:39:07.352622 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.352619 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:07.353334 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.353323 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 16:39:07.354181 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.354169 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:07.354419 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.354410 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 16:39:07.356379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.356369 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 24 16:39:07.356420 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.356382 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 16:39:07.356420 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.356393 2575 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 24 16:39:07.356420 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.356402 2575 kubelet.go:397] "Adding apiserver pod source" Apr 24 16:39:07.356420 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.356413 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 16:39:07.357392 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.357379 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:39:07.357392 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.357396 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 16:39:07.359958 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.359943 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 16:39:07.363603 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.363580 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 16:39:07.365289 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365272 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365298 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365306 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365312 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365317 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 16:39:07.365347 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365322 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365328 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365333 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365340 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 16:39:07.365347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365346 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 16:39:07.365586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365361 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 16:39:07.365586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.365371 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 16:39:07.366036 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.366026 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 16:39:07.366065 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.366036 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 16:39:07.369329 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.369306 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 16:39:07.369411 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.369346 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-83.ec2.internal\" is forbidden: 
User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 16:39:07.369411 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369372 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-83.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 16:39:07.369479 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369473 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 16:39:07.369517 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369505 2575 server.go:1295] "Started kubelet" Apr 24 16:39:07.369635 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369586 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 16:39:07.369725 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369653 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 16:39:07.369725 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.369720 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 16:39:07.370299 ip-10-0-137-83 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 16:39:07.370918 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.370800 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 16:39:07.371500 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.371486 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 24 16:39:07.376549 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.376527 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 16:39:07.377039 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.377016 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 16:39:07.378260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378240 2575 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 16:39:07.378375 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378364 2575 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 16:39:07.378566 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378556 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 16:39:07.378677 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378669 2575 reconstruct.go:97] "Volume reconstruction finished" Apr 24 16:39:07.378746 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378738 2575 reconciler.go:26] "Reconciler: start to sync state" Apr 24 16:39:07.378817 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378780 2575 factory.go:55] Registering systemd factory Apr 24 16:39:07.378908 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.378889 2575 factory.go:223] Registration of the systemd container factory successfully Apr 24 16:39:07.379532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.379489 2575 factory.go:153] Registering CRI-O factory Apr 24 16:39:07.379532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.379509 2575 factory.go:223] Registration of the crio container factory successfully Apr 
24 16:39:07.379676 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.379597 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 16:39:07.379676 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.379636 2575 factory.go:103] Registering Raw factory Apr 24 16:39:07.379676 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.379655 2575 manager.go:1196] Started watching for new ooms in manager Apr 24 16:39:07.379905 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.379885 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.380078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.380064 2575 manager.go:319] Starting recovery of all containers Apr 24 16:39:07.380932 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.380909 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 16:39:07.384959 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.384927 2575 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-83.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 16:39:07.385283 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.385151 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 16:39:07.385757 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.384896 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-83.ec2.internal.18a9586ba782c1ca default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-83.ec2.internal,UID:ip-10-0-137-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-83.ec2.internal,},FirstTimestamp:2026-04-24 16:39:07.36948065 +0000 UTC m=+0.360169457,LastTimestamp:2026-04-24 16:39:07.36948065 +0000 UTC m=+0.360169457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-83.ec2.internal,}" Apr 24 16:39:07.392967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.392949 2575 manager.go:324] 
Recovery completed Apr 24 16:39:07.397737 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.397721 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.399401 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.399382 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bphnf" Apr 24 16:39:07.400140 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400126 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.400215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400153 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.400215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400162 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.400658 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400643 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 16:39:07.400658 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400655 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 16:39:07.400746 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.400671 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 24 16:39:07.402290 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.402207 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-83.ec2.internal.18a9586ba9569630 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-83.ec2.internal,UID:ip-10-0-137-83.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-83.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-83.ec2.internal,},FirstTimestamp:2026-04-24 16:39:07.400140336 +0000 UTC m=+0.390829145,LastTimestamp:2026-04-24 16:39:07.400140336 +0000 UTC m=+0.390829145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-83.ec2.internal,}" Apr 24 16:39:07.402949 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.402932 2575 policy_none.go:49] "None policy: Start" Apr 24 16:39:07.402949 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.402950 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 16:39:07.403157 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.402961 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 24 16:39:07.407751 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.407725 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-bphnf" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443244 2575 manager.go:341] "Starting Device Plugin manager" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.443270 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443284 2575 server.go:85] "Starting device plugin registration server" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443479 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:39:07.443492 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443599 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443680 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.443709 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.444122 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 16:39:07.447359 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.444162 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.502845 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.502807 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 16:39:07.504116 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.504080 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 16:39:07.504181 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.504128 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 16:39:07.504181 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.504150 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 16:39:07.504181 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.504161 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 16:39:07.504291 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.504198 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 16:39:07.506542 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.506522 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 16:39:07.543760 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.543696 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.544726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.544710 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.544814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.544742 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.544814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.544757 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.544814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.544786 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.556543 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.556525 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.556607 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.556546 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-83.ec2.internal\": node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.572788 
ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.572768 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.605197 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.605169 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal"] Apr 24 16:39:07.605293 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.605246 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.606033 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.606020 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.606131 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.606046 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.606131 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.606062 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.607376 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.607361 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.607523 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.607509 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.607578 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.607536 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.608046 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608033 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.608125 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608033 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.608125 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608094 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.608196 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608126 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.608196 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608061 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.608196 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.608155 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.609320 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.609306 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.609364 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.609330 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 16:39:07.609913 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.609898 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientMemory" Apr 24 16:39:07.610007 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.609931 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 16:39:07.610007 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.609944 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeHasSufficientPID" Apr 24 16:39:07.637629 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.637612 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-83.ec2.internal\" not found" node="ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.641752 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.641738 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-83.ec2.internal\" not found" node="ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.673392 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.673367 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.680665 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.680644 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.680726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.680670 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5a1f5c4f174a8faa48510e8386159fc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-83.ec2.internal\" (UID: \"d5a1f5c4f174a8faa48510e8386159fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.680726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.680687 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.773963 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.773930 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found" Apr 24 16:39:07.781323 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5a1f5c4f174a8faa48510e8386159fc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-83.ec2.internal\" (UID: \"d5a1f5c4f174a8faa48510e8386159fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" Apr 24 16:39:07.781393 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-etc-kube\") 
pod \"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.781393 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781348 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.781393 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781377 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.781507 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781405 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7ba611364bea52f53a72b078e3fdc49f-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal\" (UID: \"7ba611364bea52f53a72b078e3fdc49f\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.781507 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.781403 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5a1f5c4f174a8faa48510e8386159fc-config\") pod \"kube-apiserver-proxy-ip-10-0-137-83.ec2.internal\" (UID: \"d5a1f5c4f174a8faa48510e8386159fc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.874744 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.874685 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:07.939306 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.939285 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.944870 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:07.944853 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:07.975807 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:07.975781 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.076397 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.076360 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.176914 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.176828 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.277466 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.277418 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.300874 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.300853 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 16:39:08.301022 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.301007 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 16:39:08.377247 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.377216 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 16:39:08.377666 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.377653 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.410025 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.409918 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 16:34:07 +0000 UTC" deadline="2027-12-17 16:41:15.842507542 +0000 UTC"
Apr 24 16:39:08.410025 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.409972 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14448h2m7.432539882s"
Apr 24 16:39:08.411363 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.411345 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 16:39:08.459487 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:08.459453 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a1f5c4f174a8faa48510e8386159fc.slice/crio-33f66135c826be3a2d2966cb2573f02e50e9c804a0a7d8303377d95ff5b08720 WatchSource:0}: Error finding container 33f66135c826be3a2d2966cb2573f02e50e9c804a0a7d8303377d95ff5b08720: Status 404 returned error can't find the container with id 33f66135c826be3a2d2966cb2573f02e50e9c804a0a7d8303377d95ff5b08720
Apr 24 16:39:08.459954 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:08.459929 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7ba611364bea52f53a72b078e3fdc49f.slice/crio-50cf3992f07920741fc5f14f6965377a5c17cb39852bad27ca1906605717e9ce WatchSource:0}: Error finding container 50cf3992f07920741fc5f14f6965377a5c17cb39852bad27ca1906605717e9ce: Status 404 returned error can't find the container with id 50cf3992f07920741fc5f14f6965377a5c17cb39852bad27ca1906605717e9ce
Apr 24 16:39:08.461771 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.461751 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7vjn6"
Apr 24 16:39:08.463309 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.463292 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 16:39:08.472421 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.472401 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7vjn6"
Apr 24 16:39:08.478285 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.478256 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.507684 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.507638 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" event={"ID":"7ba611364bea52f53a72b078e3fdc49f","Type":"ContainerStarted","Data":"50cf3992f07920741fc5f14f6965377a5c17cb39852bad27ca1906605717e9ce"}
Apr 24 16:39:08.508579 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.508557 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" event={"ID":"d5a1f5c4f174a8faa48510e8386159fc","Type":"ContainerStarted","Data":"33f66135c826be3a2d2966cb2573f02e50e9c804a0a7d8303377d95ff5b08720"}
Apr 24 16:39:08.578750 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.578722 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.625777 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.625757 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:08.678787 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:08.678766 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-83.ec2.internal\" not found"
Apr 24 16:39:08.768739 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.768685 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:08.779199 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.779181 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:08.794352 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.794334 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 16:39:08.796141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.796123 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal"
Apr 24 16:39:08.805661 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.805643 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 16:39:08.832994 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:08.832932 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:09.316262 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.316226 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 16:39:09.358185 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.358151 2575 apiserver.go:52] "Watching apiserver"
Apr 24 16:39:09.366758 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.366734 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 16:39:09.368636 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.368613 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-m22zz","kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz","openshift-cluster-node-tuning-operator/tuned-p5486","openshift-image-registry/node-ca-vvwg9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal","openshift-multus/multus-additional-cni-plugins-b9xc2","openshift-network-diagnostics/network-check-target-2sm9w","kube-system/global-pull-secret-syncer-pqspn","kube-system/konnectivity-agent-zfvwh","openshift-dns/node-resolver-49ml9","openshift-multus/multus-dq9ms","openshift-multus/network-metrics-daemon-tgkjm","openshift-network-operator/iptables-alerter-b7wgk"]
Apr 24 16:39:09.371281 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.371255 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.371655 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.371634 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:09.371760 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.371719 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:09.372540 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.372518 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.373651 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.373626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.374706 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.374686 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.374809 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.374798 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 16:39:09.374889 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.374868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.374889 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.374881 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.375142 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375125 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-crgbx\""
Apr 24 16:39:09.375142 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.374686 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 16:39:09.375393 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375372 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 16:39:09.375592 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-8jfrg\""
Apr 24 16:39:09.375643 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375612 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 16:39:09.375698 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375686 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.375770 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.375749 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.376424 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.376406 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.376537 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.376490 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-z9jzk\""
Apr 24 16:39:09.376608 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.376542 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.376879 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.376861 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.377746 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.377724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:09.377846 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.377806 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:09.380232 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.380210 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.380326 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.380287 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.382839 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.382820 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zfvwh"
Apr 24 16:39:09.383690 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.383674 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.383787 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.383755 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.383846 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.383798 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.384225 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.384206 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9h4fl\""
Apr 24 16:39:09.384328 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.384249 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-49ml9"
Apr 24 16:39:09.385081 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.385065 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 16:39:09.385261 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.385091 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 16:39:09.385261 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.385246 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bbbr5\""
Apr 24 16:39:09.385456 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.385436 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-7wgt2\""
Apr 24 16:39:09.385914 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.385897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:09.385994 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.385960 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:09.387302 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.387282 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 16:39:09.387528 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.387510 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 16:39:09.387707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.387691 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.389413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.389391 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wwhfn\""
Apr 24 16:39:09.389484 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.389442 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 16:39:09.390543 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.390519 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392052 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-system-cni-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-bin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392161 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-multus\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392329 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-etc-tuned\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392402 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392416 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-env-overrides\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392442 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392593 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-cni-binary-copy\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392618 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392690 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-var-lib-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392710 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392723 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-ovn\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392756 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-node-log\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09d386a-3466-46d1-a1d1-efb87cc77eba-serviceca\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392826 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjfv\" (UniqueName: \"kubernetes.io/projected/f8b7b6cb-c76c-42e3-9193-9423bbd58047-kube-api-access-dbjfv\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qk2v\" (UniqueName: \"kubernetes.io/projected/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kube-api-access-9qk2v\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.394453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392198 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.392992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-os-release\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393029 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-kubelet\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9n45\" (UniqueName: \"kubernetes.io/projected/2e65a987-99ed-48bc-a17d-431dde198e65-kube-api-access-d9n45\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393154 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-host\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393181 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-log-socket\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393216 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393224 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lcsbh\""
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-sys-fs\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-multus-certs\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393444 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393472 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-run\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393483 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mlsxz\""
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-var-lib-kubelet\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393599 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-systemd-units\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-systemd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393738 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-registration-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393824 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f09d386a-3466-46d1-a1d1-efb87cc77eba-host\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.395454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwwfs\" (UniqueName: \"kubernetes.io/projected/3a8fc6e4-a708-4dc6-b6fa-e357db388623-kube-api-access-jwwfs\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393902 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-systemd\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393944 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.393933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-etc-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394011 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqnsr\" (UniqueName: \"kubernetes.io/projected/6e8405b8-571c-4fb5-8e11-7148ed4e4115-kube-api-access-qqnsr\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394044 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-socket-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394073 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-hostroot\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394099 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-conf-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-netns\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394178 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-bin\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-socket-dir-parent\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394325 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-netns\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-tmp\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-etc-kubernetes\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394507 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-modprobe-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394527 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-kubernetes\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.396187 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.394556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slcrr\" (UniqueName: \"kubernetes.io/projected/3829df52-8015-4e76-945f-372f684a4e9c-kube-api-access-slcrr\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.396187
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-slash\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396077 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396135 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-os-release\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396162 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-device-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396222 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396258 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-multus-daemon-config\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396283 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-kubelet\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396328 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovn-node-metrics-cert\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396392 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-system-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-sys\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396449 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysconfig\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396477 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396512 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cnibin\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.396988 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396552 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4phzq\" (UniqueName: \"kubernetes.io/projected/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-kube-api-access-4phzq\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396572 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25d8\" (UniqueName: \"kubernetes.io/projected/f09d386a-3466-46d1-a1d1-efb87cc77eba-kube-api-access-j25d8\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396592 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" 
(UniqueName: \"kubernetes.io/configmap/3a8fc6e4-a708-4dc6-b6fa-e357db388623-iptables-alerter-script\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396606 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-cnibin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396620 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396641 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-script-lib\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396660 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: 
I0424 16:39:09.396674 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396689 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-netd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-config\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396743 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a8fc6e4-a708-4dc6-b6fa-e357db388623-host-slash\") pod 
\"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396780 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-k8s-cni-cncf-io\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396797 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-conf\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.397586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.396818 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-lib-modules\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.473828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.473787 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:08 +0000 UTC" deadline="2027-12-26 16:33:08.628826074 +0000 UTC" Apr 24 16:39:09.473828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.473820 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14663h53m59.155010049s" Apr 24 16:39:09.479509 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.479483 2575 
desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 16:39:09.497213 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-registration-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497219 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f09d386a-3466-46d1-a1d1-efb87cc77eba-host\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497235 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwwfs\" (UniqueName: \"kubernetes.io/projected/3a8fc6e4-a708-4dc6-b6fa-e357db388623-kube-api-access-jwwfs\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-systemd\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-etc-openvswitch\") 
pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqnsr\" (UniqueName: \"kubernetes.io/projected/6e8405b8-571c-4fb5-8e11-7148ed4e4115-kube-api-access-qqnsr\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497344 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-systemd\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.497349 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-socket-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-registration-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-hostroot\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497407 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-dbus\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497302 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f09d386a-3466-46d1-a1d1-efb87cc77eba-host\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497362 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-etc-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497474 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-socket-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497491 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-conf-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-conf-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-netns\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497590 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-bin\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.497618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497612 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-hostroot\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497623 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-netns\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497619 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f2523fe-21a3-46f7-a03b-88e7ae991338-hosts-file\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497664 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-bin\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497700 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-socket-dir-parent\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497732 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-netns\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-tmp\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497793 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-etc-kubernetes\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497805 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-socket-dir-parent\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-modprobe-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-kubernetes\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-etc-kubernetes\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-netns\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497882 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-kubernetes\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497874 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slcrr\" (UniqueName: \"kubernetes.io/projected/3829df52-8015-4e76-945f-372f684a4e9c-kube-api-access-slcrr\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497913 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-slash\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-modprobe-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.498141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497964 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rbbv\" (UniqueName: \"kubernetes.io/projected/7f2523fe-21a3-46f7-a03b-88e7ae991338-kube-api-access-2rbbv\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497978 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-slash\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497993 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-os-release\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.497981 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498026 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498050 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-device-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498070 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-os-release\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498073 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498074 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-multus-daemon-config\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498146 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-multus-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-device-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498170 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498217 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-kubelet\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.498221 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498245 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovn-node-metrics-cert\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498274 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-system-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.498938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-kubelet\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.498301 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:09.998276645 +0000 UTC m=+2.988965456 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-system-cni-dir\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498373 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-sys\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498409 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-kubelet-config\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-sys\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498459 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysconfig\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498545 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cnibin\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498555 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysconfig\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-d\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4phzq\" (UniqueName: \"kubernetes.io/projected/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-kube-api-access-4phzq\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498637 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cnibin\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j25d8\" (UniqueName: \"kubernetes.io/projected/f09d386a-3466-46d1-a1d1-efb87cc77eba-kube-api-access-j25d8\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.499828 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a8fc6e4-a708-4dc6-b6fa-e357db388623-iptables-alerter-script\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-cnibin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498725 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-multus-daemon-config\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-cnibin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498813 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498841 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-script-lib\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498881 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498918 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498945 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-netd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498969 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-config\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.498997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ece18d-8c30-46e6-aea9-dad90b2644cb-agent-certs\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a8fc6e4-a708-4dc6-b6fa-e357db388623-host-slash\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499119 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-k8s-cni-cncf-io\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-conf\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499177 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-lib-modules\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.500616 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499204 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ece18d-8c30-46e6-aea9-dad90b2644cb-konnectivity-ca\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499232 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-system-cni-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/3a8fc6e4-a708-4dc6-b6fa-e357db388623-iptables-alerter-script\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499261 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-bin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499289 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-multus\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499310 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-cni-netd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499313 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-etc-tuned\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499351 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-env-overrides\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499397 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-lib-modules\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499400 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499429 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-etc-sysctl-conf\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499435 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-cni-binary-copy\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-var-lib-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499487 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-k8s-cni-cncf-io\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499501 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-script-lib\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-ovn\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.501413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499545 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-ovn\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499552 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-node-log\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499596 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f2523fe-21a3-46f7-a03b-88e7ae991338-tmp-dir\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-bin\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499627 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09d386a-3466-46d1-a1d1-efb87cc77eba-serviceca\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjfv\" (UniqueName: \"kubernetes.io/projected/f8b7b6cb-c76c-42e3-9193-9423bbd58047-kube-api-access-dbjfv\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qk2v\" (UniqueName: \"kubernetes.io/projected/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kube-api-access-9qk2v\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-os-release\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499754 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-kubelet\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9n45\" (UniqueName: \"kubernetes.io/projected/2e65a987-99ed-48bc-a17d-431dde198e65-kube-api-access-d9n45\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499972 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-host\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2e65a987-99ed-48bc-a17d-431dde198e65-cni-binary-copy\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f09d386a-3466-46d1-a1d1-efb87cc77eba-serviceca\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-log-socket\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-system-cni-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500065 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-var-lib-openvswitch\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500079 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-node-log\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.499999 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-log-socket\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500140 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500169 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-sys-fs\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500197 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-multus-certs\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500212 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3a8fc6e4-a708-4dc6-b6fa-e357db388623-host-slash\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500221 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-run\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500249 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-host-run-ovn-kubernetes\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500248 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-var-lib-kubelet\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-systemd-units\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500333 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-systemd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500346 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-sys-fs\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz"
Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424
16:39:09.500170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-etc-selinux\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500402 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-systemd-units\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500411 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-run-multus-certs\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500439 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6e8405b8-571c-4fb5-8e11-7148ed4e4115-run-systemd\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500307 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-var-lib-kubelet\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500464 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-run\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500500 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-cni-multus\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.502707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500524 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-os-release\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovnkube-config\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500582 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3829df52-8015-4e76-945f-372f684a4e9c-host\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500613 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/2e65a987-99ed-48bc-a17d-431dde198e65-host-var-lib-kubelet\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500699 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.500845 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6e8405b8-571c-4fb5-8e11-7148ed4e4115-env-overrides\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.501428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-binary-copy\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.501510 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.502039 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-etc-tuned\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.502061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3829df52-8015-4e76-945f-372f684a4e9c-tmp\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.503222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.502064 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6e8405b8-571c-4fb5-8e11-7148ed4e4115-ovn-node-metrics-cert\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.510849 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.510824 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqnsr\" (UniqueName: \"kubernetes.io/projected/6e8405b8-571c-4fb5-8e11-7148ed4e4115-kube-api-access-qqnsr\") pod \"ovnkube-node-m22zz\" (UID: \"6e8405b8-571c-4fb5-8e11-7148ed4e4115\") " pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.512322 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.512301 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qk2v\" (UniqueName: \"kubernetes.io/projected/7a8fd9ce-fceb-4edf-8036-6b698f218fc2-kube-api-access-9qk2v\") pod \"aws-ebs-csi-driver-node-46chz\" (UID: \"7a8fd9ce-fceb-4edf-8036-6b698f218fc2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.512414 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.512321 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25d8\" (UniqueName: \"kubernetes.io/projected/f09d386a-3466-46d1-a1d1-efb87cc77eba-kube-api-access-j25d8\") pod \"node-ca-vvwg9\" (UID: \"f09d386a-3466-46d1-a1d1-efb87cc77eba\") " pod="openshift-image-registry/node-ca-vvwg9" Apr 24 16:39:09.512584 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.512562 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:09.512649 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.512595 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:09.512649 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.512611 2575 projected.go:194] Error preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:09.512778 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.512721 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:10.01269885 +0000 UTC m=+3.003387682 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:09.514652 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.514631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwwfs\" (UniqueName: \"kubernetes.io/projected/3a8fc6e4-a708-4dc6-b6fa-e357db388623-kube-api-access-jwwfs\") pod \"iptables-alerter-b7wgk\" (UID: \"3a8fc6e4-a708-4dc6-b6fa-e357db388623\") " pod="openshift-network-operator/iptables-alerter-b7wgk" Apr 24 16:39:09.518578 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.515455 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slcrr\" (UniqueName: \"kubernetes.io/projected/3829df52-8015-4e76-945f-372f684a4e9c-kube-api-access-slcrr\") pod \"tuned-p5486\" (UID: \"3829df52-8015-4e76-945f-372f684a4e9c\") " pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.518578 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.515815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9n45\" (UniqueName: \"kubernetes.io/projected/2e65a987-99ed-48bc-a17d-431dde198e65-kube-api-access-d9n45\") pod \"multus-dq9ms\" (UID: \"2e65a987-99ed-48bc-a17d-431dde198e65\") " pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.518578 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.517226 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjfv\" (UniqueName: \"kubernetes.io/projected/f8b7b6cb-c76c-42e3-9193-9423bbd58047-kube-api-access-dbjfv\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " 
pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:09.519316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.519299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4phzq\" (UniqueName: \"kubernetes.io/projected/647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b-kube-api-access-4phzq\") pod \"multus-additional-cni-plugins-b9xc2\" (UID: \"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b\") " pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.600815 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600726 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ece18d-8c30-46e6-aea9-dad90b2644cb-agent-certs\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:09.600815 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600772 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ece18d-8c30-46e6-aea9-dad90b2644cb-konnectivity-ca\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:09.600815 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f2523fe-21a3-46f7-a03b-88e7ae991338-tmp-dir\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-dbus\") pod \"global-pull-secret-syncer-pqspn\" (UID: 
\"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600871 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600896 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f2523fe-21a3-46f7-a03b-88e7ae991338-hosts-file\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600926 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rbbv\" (UniqueName: \"kubernetes.io/projected/7f2523fe-21a3-46f7-a03b-88e7ae991338-kube-api-access-2rbbv\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.600978 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-kubelet-config\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.601095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.601048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-kubelet-config\") pod 
\"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.601402 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.601175 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:09.601402 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:09.601288 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:10.101266945 +0000 UTC m=+3.091955754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:09.601402 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.601323 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7f2523fe-21a3-46f7-a03b-88e7ae991338-hosts-file\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.601402 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.601382 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/71ece18d-8c30-46e6-aea9-dad90b2644cb-konnectivity-ca\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:09.601599 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.601459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" 
(UniqueName: \"kubernetes.io/host-path/a49b4d14-b188-40e4-828a-7109543078dc-dbus\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:09.601694 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.601655 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7f2523fe-21a3-46f7-a03b-88e7ae991338-tmp-dir\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.603909 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.603887 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/71ece18d-8c30-46e6-aea9-dad90b2644cb-agent-certs\") pod \"konnectivity-agent-zfvwh\" (UID: \"71ece18d-8c30-46e6-aea9-dad90b2644cb\") " pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:09.611526 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.611503 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rbbv\" (UniqueName: \"kubernetes.io/projected/7f2523fe-21a3-46f7-a03b-88e7ae991338-kube-api-access-2rbbv\") pod \"node-resolver-49ml9\" (UID: \"7f2523fe-21a3-46f7-a03b-88e7ae991338\") " pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:09.683038 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.683001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" Apr 24 16:39:09.690949 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.690922 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" Apr 24 16:39:09.700578 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.700555 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-p5486" Apr 24 16:39:09.705202 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.705182 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vvwg9" Apr 24 16:39:09.712680 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.712664 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-b7wgk" Apr 24 16:39:09.719252 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.719235 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:39:09.725772 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.725748 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-dq9ms" Apr 24 16:39:09.732289 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.732264 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:09.737848 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:09.737831 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-49ml9" Apr 24 16:39:10.004269 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.004237 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:10.004428 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.004393 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:10.004472 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.004442 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.004428863 +0000 UTC m=+3.995117658 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:10.052705 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.052677 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a8fd9ce_fceb_4edf_8036_6b698f218fc2.slice/crio-06291c152a81e2bde820adc2f6f149c727626faa19df896e358f26df4562504a WatchSource:0}: Error finding container 06291c152a81e2bde820adc2f6f149c727626faa19df896e358f26df4562504a: Status 404 returned error can't find the container with id 06291c152a81e2bde820adc2f6f149c727626faa19df896e358f26df4562504a Apr 24 16:39:10.054599 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.054578 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647cb6b6_c1d7_4f0a_bd40_b5e358e9c90b.slice/crio-3017572d5edfcefda7f1a09dd9fe7dec634fa660ccd75fce9a5ae6dcb7c5017c WatchSource:0}: Error finding container 3017572d5edfcefda7f1a09dd9fe7dec634fa660ccd75fce9a5ae6dcb7c5017c: Status 404 returned error can't find the container with id 3017572d5edfcefda7f1a09dd9fe7dec634fa660ccd75fce9a5ae6dcb7c5017c Apr 24 16:39:10.056015 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.055938 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e8405b8_571c_4fb5_8e11_7148ed4e4115.slice/crio-e70e914c8bcf7aad78b1f5b7c5e35aa7d1c4f16953a651881dfb1fbe4f463ed8 WatchSource:0}: Error finding container e70e914c8bcf7aad78b1f5b7c5e35aa7d1c4f16953a651881dfb1fbe4f463ed8: Status 404 returned error can't find the container with id e70e914c8bcf7aad78b1f5b7c5e35aa7d1c4f16953a651881dfb1fbe4f463ed8 Apr 24 16:39:10.056861 
ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.056801 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3829df52_8015_4e76_945f_372f684a4e9c.slice/crio-07561a0365ee1034358b13412ab01caa3786488112c1c1e8203714feb94f96ac WatchSource:0}: Error finding container 07561a0365ee1034358b13412ab01caa3786488112c1c1e8203714feb94f96ac: Status 404 returned error can't find the container with id 07561a0365ee1034358b13412ab01caa3786488112c1c1e8203714feb94f96ac Apr 24 16:39:10.058366 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.058269 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf09d386a_3466_46d1_a1d1_efb87cc77eba.slice/crio-d59175b060e9a9e1032913d3b6b15519445ea3ff0b5170f5900aeefc94c36520 WatchSource:0}: Error finding container d59175b060e9a9e1032913d3b6b15519445ea3ff0b5170f5900aeefc94c36520: Status 404 returned error can't find the container with id d59175b060e9a9e1032913d3b6b15519445ea3ff0b5170f5900aeefc94c36520 Apr 24 16:39:10.059564 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.058989 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a8fc6e4_a708_4dc6_b6fa_e357db388623.slice/crio-e350f24566266dfa323dd30fbf34b862dddd2f460c61f5b908d91f2699633a56 WatchSource:0}: Error finding container e350f24566266dfa323dd30fbf34b862dddd2f460c61f5b908d91f2699633a56: Status 404 returned error can't find the container with id e350f24566266dfa323dd30fbf34b862dddd2f460c61f5b908d91f2699633a56 Apr 24 16:39:10.060978 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.060958 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f2523fe_21a3_46f7_a03b_88e7ae991338.slice/crio-820df0f8c1f6ddb9b3be3e24687c9cbcc6bc25eb0472cc98054bd8ea310ea17a WatchSource:0}: Error 
finding container 820df0f8c1f6ddb9b3be3e24687c9cbcc6bc25eb0472cc98054bd8ea310ea17a: Status 404 returned error can't find the container with id 820df0f8c1f6ddb9b3be3e24687c9cbcc6bc25eb0472cc98054bd8ea310ea17a Apr 24 16:39:10.062160 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.061941 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71ece18d_8c30_46e6_aea9_dad90b2644cb.slice/crio-b81e2a79491679ad064a56c586a26f1fa05f6524b81e9a7560594b9098e50c09 WatchSource:0}: Error finding container b81e2a79491679ad064a56c586a26f1fa05f6524b81e9a7560594b9098e50c09: Status 404 returned error can't find the container with id b81e2a79491679ad064a56c586a26f1fa05f6524b81e9a7560594b9098e50c09 Apr 24 16:39:10.063386 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:10.063358 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e65a987_99ed_48bc_a17d_431dde198e65.slice/crio-00cef1632074f35a271902dbda38a2d81d0806619c55d87db192e77507ec2b7c WatchSource:0}: Error finding container 00cef1632074f35a271902dbda38a2d81d0806619c55d87db192e77507ec2b7c: Status 404 returned error can't find the container with id 00cef1632074f35a271902dbda38a2d81d0806619c55d87db192e77507ec2b7c Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.104782 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.104845 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: 
\"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.104987 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.105000 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.105009 2575 projected.go:194] Error preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.105069 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.105055691 +0000 UTC m=+4.095744487 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.105170 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:10.107129 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.105211 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:11.105200617 +0000 UTC m=+4.095889415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:10.475056 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.474989 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 16:34:08 +0000 UTC" deadline="2028-02-08 22:33:17.428186328 +0000 UTC" Apr 24 16:39:10.475056 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.475032 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15725h54m6.953157889s" Apr 24 16:39:10.505886 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.505384 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:10.505886 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:10.505507 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:10.525631 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.525562 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b7wgk" event={"ID":"3a8fc6e4-a708-4dc6-b6fa-e357db388623","Type":"ContainerStarted","Data":"e350f24566266dfa323dd30fbf34b862dddd2f460c61f5b908d91f2699633a56"} Apr 24 16:39:10.534141 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.534060 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvwg9" event={"ID":"f09d386a-3466-46d1-a1d1-efb87cc77eba","Type":"ContainerStarted","Data":"d59175b060e9a9e1032913d3b6b15519445ea3ff0b5170f5900aeefc94c36520"} Apr 24 16:39:10.546532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.546480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p5486" event={"ID":"3829df52-8015-4e76-945f-372f684a4e9c","Type":"ContainerStarted","Data":"07561a0365ee1034358b13412ab01caa3786488112c1c1e8203714feb94f96ac"} Apr 24 16:39:10.556917 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.556857 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"e70e914c8bcf7aad78b1f5b7c5e35aa7d1c4f16953a651881dfb1fbe4f463ed8"} Apr 24 16:39:10.559693 ip-10-0-137-83 kubenswrapper[2575]: 
I0424 16:39:10.559627 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zfvwh" event={"ID":"71ece18d-8c30-46e6-aea9-dad90b2644cb","Type":"ContainerStarted","Data":"b81e2a79491679ad064a56c586a26f1fa05f6524b81e9a7560594b9098e50c09"} Apr 24 16:39:10.563154 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.563069 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-49ml9" event={"ID":"7f2523fe-21a3-46f7-a03b-88e7ae991338","Type":"ContainerStarted","Data":"820df0f8c1f6ddb9b3be3e24687c9cbcc6bc25eb0472cc98054bd8ea310ea17a"} Apr 24 16:39:10.568293 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.568258 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerStarted","Data":"3017572d5edfcefda7f1a09dd9fe7dec634fa660ccd75fce9a5ae6dcb7c5017c"} Apr 24 16:39:10.582708 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.582680 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" event={"ID":"7a8fd9ce-fceb-4edf-8036-6b698f218fc2","Type":"ContainerStarted","Data":"06291c152a81e2bde820adc2f6f149c727626faa19df896e358f26df4562504a"} Apr 24 16:39:10.596471 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.596434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" event={"ID":"d5a1f5c4f174a8faa48510e8386159fc","Type":"ContainerStarted","Data":"281afb9a8e87fbab8c6ef5671bd8a7c25fdb8b686c482bb2916d2b84350e1bae"} Apr 24 16:39:10.616435 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:10.616385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dq9ms" event={"ID":"2e65a987-99ed-48bc-a17d-431dde198e65","Type":"ContainerStarted","Data":"00cef1632074f35a271902dbda38a2d81d0806619c55d87db192e77507ec2b7c"} Apr 24 16:39:11.014249 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.014210 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:11.014451 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.014379 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.014451 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.014440 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.014421201 +0000 UTC m=+6.005109998 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:11.115012 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.114980 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:11.115168 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.115051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:11.115234 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115224 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:11.115296 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115245 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:11.115296 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115257 2575 projected.go:194] Error preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.115387 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115312 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.115294366 +0000 UTC m=+6.105983182 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:11.115701 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115684 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:11.115766 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.115733 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:13.11571946 +0000 UTC m=+6.106408258 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:11.504511 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.504481 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:11.504977 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.504605 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:11.505039 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.505002 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:11.505209 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:11.505097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:11.644474 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.644438 2575 generic.go:358] "Generic (PLEG): container finished" podID="7ba611364bea52f53a72b078e3fdc49f" containerID="86d51ea667debb1cc441122c5652851766c8e6eafec89729ba858496dfed0d4c" exitCode=0 Apr 24 16:39:11.645201 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.645169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" event={"ID":"7ba611364bea52f53a72b078e3fdc49f","Type":"ContainerDied","Data":"86d51ea667debb1cc441122c5652851766c8e6eafec89729ba858496dfed0d4c"} Apr 24 16:39:11.660297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:11.660235 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-83.ec2.internal" podStartSLOduration=3.660218419 podStartE2EDuration="3.660218419s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:10.616309937 +0000 UTC m=+3.606998756" watchObservedRunningTime="2026-04-24 16:39:11.660218419 +0000 UTC m=+4.650907237" Apr 24 16:39:12.505740 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:12.505224 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:12.505740 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:12.505350 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:12.654589 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:12.654551 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" event={"ID":"7ba611364bea52f53a72b078e3fdc49f","Type":"ContainerStarted","Data":"eac0a5f8071916e4e1e8a0fddbac99e3b8a3b03dd764673d01d165f87be6c7cd"} Apr 24 16:39:12.682312 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:12.682251 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-83.ec2.internal" podStartSLOduration=4.68222876 podStartE2EDuration="4.68222876s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:39:12.681866493 +0000 UTC m=+5.672555311" watchObservedRunningTime="2026-04-24 16:39:12.68222876 +0000 UTC m=+5.672917569" Apr 24 16:39:13.031517 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:13.030787 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:13.031517 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.031025 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:13.031517 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.031091 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs 
podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:17.031071597 +0000 UTC m=+10.021760397 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:13.132093 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:13.132039 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:13.132254 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:13.132152 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:13.132422 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132404 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:13.132484 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132430 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:13.132484 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132444 2575 projected.go:194] Error preparing data for projected volume 
kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:13.132563 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132507 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:17.132487682 +0000 UTC m=+10.123176489 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:13.132621 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132594 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:13.132656 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.132628 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:17.132617557 +0000 UTC m=+10.123306356 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:13.508905 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:13.508861 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:13.509339 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.508993 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:13.509409 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:13.509384 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:13.509520 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:13.509480 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:14.505066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:14.505025 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:14.505252 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:14.505178 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:15.505019 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:15.504949 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:15.505477 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:15.505087 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:15.505477 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:15.505149 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:15.505477 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:15.505290 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:16.505520 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:16.504984 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:16.505520 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:16.505147 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:17.065073 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:17.065036 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:17.065282 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.065236 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:17.065344 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.065293 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.065280571 +0000 UTC m=+18.055969371 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:17.166871 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:17.166822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:17.167065 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:17.166932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169270 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169364 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.169340909 +0000 UTC m=+18.160029708 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169488 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169509 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169529 2575 projected.go:194] Error preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:17.171138 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.169585 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:25.169563834 +0000 UTC m=+18.160252646 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:17.506371 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:17.506335 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:17.506821 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.506466 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:17.506821 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:17.506520 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:17.506821 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:17.506701 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:18.505051 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:18.505015 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:18.505249 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:18.505146 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:19.505276 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:19.505240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:19.505709 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:19.505239 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:19.505709 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:19.505386 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:19.505709 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:19.505519 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:20.504843 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:20.504808 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:20.505025 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:20.504922 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:21.505301 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:21.505268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:21.505751 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:21.505268 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:21.505751 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:21.505415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:21.505751 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:21.505468 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:22.505118 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:22.505070 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:22.505310 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:22.505218 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:23.505374 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:23.505337 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:23.505759 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:23.505474 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:23.505759 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:23.505525 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:23.505759 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:23.505641 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:24.504904 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:24.504868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:24.505096 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:24.504991 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:25.130622 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:25.130580 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:25.130983 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.130724 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.130983 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.130773 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.130759548 +0000 UTC m=+34.121448343 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 16:39:25.230984 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:25.230944 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:25.231158 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:25.231014 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:25.231158 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231132 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:25.231158 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231139 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 16:39:25.231312 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231171 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 16:39:25.231312 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231185 2575 projected.go:194] Error 
preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:25.231312 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231195 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.231177374 +0000 UTC m=+34.221866187 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered Apr 24 16:39:25.231312 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.231236 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.231219976 +0000 UTC m=+34.221908780 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 16:39:25.504763 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:25.504716 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:25.504945 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:25.504767 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:25.504945 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.504856 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:25.505050 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:25.505005 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:26.504830 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:26.504793 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:26.505287 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:26.504902 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:27.506288 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:27.506217 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:27.506614 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:27.506341 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:27.506614 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:27.506337 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:27.506614 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:27.506469 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:28.504912 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.504574 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:28.505051 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:28.505005 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:28.685956 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.685875 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" event={"ID":"7a8fd9ce-fceb-4edf-8036-6b698f218fc2","Type":"ContainerStarted","Data":"3f14feda4f4d00b55954fe9abf6e587f9b1ea551a61f354aa518d86c855c726a"} Apr 24 16:39:28.687432 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.687349 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-dq9ms" event={"ID":"2e65a987-99ed-48bc-a17d-431dde198e65","Type":"ContainerStarted","Data":"97dce7b4ad25ca1cc97bf0b8f13bab8d02d1edb68e23426af9c7b1ded7b161a5"} Apr 24 16:39:28.688675 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.688646 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vvwg9" event={"ID":"f09d386a-3466-46d1-a1d1-efb87cc77eba","Type":"ContainerStarted","Data":"888d34b9119b7e864dd6fe91c73c5a759a483241dc19c45c052391e4baad3f4a"} Apr 24 16:39:28.690024 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.690000 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-p5486" event={"ID":"3829df52-8015-4e76-945f-372f684a4e9c","Type":"ContainerStarted","Data":"2baff6886469964769cd50cbebf0c6a57834bd5766d99dad338b840b06ba9d85"} Apr 24 16:39:28.692619 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:39:28.692599 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:39:28.692936 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.692917 2575 generic.go:358] "Generic (PLEG): container finished" podID="6e8405b8-571c-4fb5-8e11-7148ed4e4115" containerID="a1d671b2110d96277b86b9ee5cc7bcc9f02b35e3034772dade5e4fbe74a3f27a" exitCode=1 Apr 24 16:39:28.693013 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.692980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"ec41856736fa5197067fae7b071f3f7b8446f9d4994120a7cbe1c5d13fec59f4"} Apr 24 16:39:28.693013 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.693004 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"b4c1f4c31e7cdecf6dddb225206aaef5f726aaa19e0bfaf865426df0552dd01d"} Apr 24 16:39:28.693132 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.693017 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"e4959c37e15fba24dadd876b337857fc03c8cd04584e2150840e6ffd12306bcc"} Apr 24 16:39:28.693132 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.693031 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"e0d0bb6e97c9ac395d8cbb28304cd682b5c61f214a8f8f7e16c7cfe13ddd0417"} Apr 24 16:39:28.693132 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.693040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" 
event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerDied","Data":"a1d671b2110d96277b86b9ee5cc7bcc9f02b35e3034772dade5e4fbe74a3f27a"} Apr 24 16:39:28.693132 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.693049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"ffd9bb9b19c7eca32311be630d002af1f0578f29128563cb3d977805dded3ac0"} Apr 24 16:39:28.694453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.694434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-zfvwh" event={"ID":"71ece18d-8c30-46e6-aea9-dad90b2644cb","Type":"ContainerStarted","Data":"05b063f8b46d5ee082940167568e23d6adc310ab29b31f78cce46e75c13baa4c"} Apr 24 16:39:28.696151 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.696129 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-49ml9" event={"ID":"7f2523fe-21a3-46f7-a03b-88e7ae991338","Type":"ContainerStarted","Data":"e5e8e27cb9849cee83f08081245e044cf7e9795bf3fed99add774773b46b747f"} Apr 24 16:39:28.697506 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.697483 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="92fb19559f99d48ebf42ad1c629ca8fe8ac30512e5de839663649fc712161dfd" exitCode=0 Apr 24 16:39:28.697591 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.697511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"92fb19559f99d48ebf42ad1c629ca8fe8ac30512e5de839663649fc712161dfd"} Apr 24 16:39:28.728471 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.728421 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-dq9ms" podStartSLOduration=4.19500968 
podStartE2EDuration="21.72840785s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.065319868 +0000 UTC m=+3.056008662" lastFinishedPulling="2026-04-24 16:39:27.598718036 +0000 UTC m=+20.589406832" observedRunningTime="2026-04-24 16:39:28.706894289 +0000 UTC m=+21.697583109" watchObservedRunningTime="2026-04-24 16:39:28.72840785 +0000 UTC m=+21.719096730" Apr 24 16:39:28.740766 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.740710 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vvwg9" podStartSLOduration=4.573719111 podStartE2EDuration="21.740696675s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.060048632 +0000 UTC m=+3.050737439" lastFinishedPulling="2026-04-24 16:39:27.2270262 +0000 UTC m=+20.217715003" observedRunningTime="2026-04-24 16:39:28.74020799 +0000 UTC m=+21.730896808" watchObservedRunningTime="2026-04-24 16:39:28.740696675 +0000 UTC m=+21.731385495" Apr 24 16:39:28.752081 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.752038 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-zfvwh" podStartSLOduration=9.052989978 podStartE2EDuration="21.75202959s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.06412029 +0000 UTC m=+3.054809099" lastFinishedPulling="2026-04-24 16:39:22.763159913 +0000 UTC m=+15.753848711" observedRunningTime="2026-04-24 16:39:28.752020479 +0000 UTC m=+21.742709296" watchObservedRunningTime="2026-04-24 16:39:28.75202959 +0000 UTC m=+21.742718400" Apr 24 16:39:28.794669 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.794622 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-p5486" podStartSLOduration=4.288626007 podStartE2EDuration="21.794607134s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" 
firstStartedPulling="2026-04-24 16:39:10.058703755 +0000 UTC m=+3.049392556" lastFinishedPulling="2026-04-24 16:39:27.564684873 +0000 UTC m=+20.555373683" observedRunningTime="2026-04-24 16:39:28.777453582 +0000 UTC m=+21.768142406" watchObservedRunningTime="2026-04-24 16:39:28.794607134 +0000 UTC m=+21.785295958" Apr 24 16:39:28.794924 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:28.794904 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-49ml9" podStartSLOduration=4.292832132 podStartE2EDuration="21.794900475s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.062696742 +0000 UTC m=+3.053385550" lastFinishedPulling="2026-04-24 16:39:27.564765094 +0000 UTC m=+20.555453893" observedRunningTime="2026-04-24 16:39:28.794266274 +0000 UTC m=+21.784955091" watchObservedRunningTime="2026-04-24 16:39:28.794900475 +0000 UTC m=+21.785589292" Apr 24 16:39:29.178119 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.177903 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 16:39:29.455291 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.455184 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T16:39:29.178099526Z","UUID":"24035e4a-7ab4-4094-997e-f77cda8cd5fc","Handler":null,"Name":"","Endpoint":""} Apr 24 16:39:29.456921 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.456895 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 16:39:29.457032 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.456945 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 16:39:29.504746 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.504714 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:39:29.504886 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.504753 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:39:29.504886 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:29.504847 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc" Apr 24 16:39:29.504988 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:29.504970 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:39:29.701438 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.701400 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" event={"ID":"7a8fd9ce-fceb-4edf-8036-6b698f218fc2","Type":"ContainerStarted","Data":"1763ca193e2ad44fa3c01b3ed19a0fecff9f4226fddfd25409e988567c3a333a"} Apr 24 16:39:29.702894 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.702873 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-b7wgk" event={"ID":"3a8fc6e4-a708-4dc6-b6fa-e357db388623","Type":"ContainerStarted","Data":"c81b541ac33e5b019cdbb877548092fce4516a2673fc8fe67a0ad6f9f8489e28"} Apr 24 16:39:29.717044 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:29.716956 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-b7wgk" podStartSLOduration=5.212773272 podStartE2EDuration="22.716945098s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.060642969 +0000 UTC m=+3.051331764" lastFinishedPulling="2026-04-24 16:39:27.564814794 +0000 UTC m=+20.555503590" observedRunningTime="2026-04-24 16:39:29.716516135 +0000 UTC m=+22.707204953" watchObservedRunningTime="2026-04-24 16:39:29.716945098 +0000 UTC m=+22.707633914" Apr 24 16:39:30.504386 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:30.504360 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:39:30.504540 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:30.504479 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8" Apr 24 16:39:30.706904 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:30.706872 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" event={"ID":"7a8fd9ce-fceb-4edf-8036-6b698f218fc2","Type":"ContainerStarted","Data":"98cdcf36db5b0f216b29f681d8b3014bbd3518512d0681eabb5514643e87f76c"} Apr 24 16:39:30.709763 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:30.709738 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:39:30.710119 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:30.710080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"230aaa97c3bd16224aa4bc6a97af34cedd9c0290558ed0bba5bfd02ff6bc12d5"} Apr 24 16:39:31.071785 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.071702 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:31.072329 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.072306 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-zfvwh" Apr 24 16:39:31.088559 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.088524 
2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-46chz" podStartSLOduration=3.717460055 podStartE2EDuration="24.088511528s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.054338291 +0000 UTC m=+3.045027100" lastFinishedPulling="2026-04-24 16:39:30.425389763 +0000 UTC m=+23.416078573" observedRunningTime="2026-04-24 16:39:30.73965374 +0000 UTC m=+23.730342557" watchObservedRunningTime="2026-04-24 16:39:31.088511528 +0000 UTC m=+24.079200370"
Apr 24 16:39:31.505384 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.505339 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:31.505562 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:31.505463 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:31.505562 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.505522 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:31.505678 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:31.505620 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:31.712017 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.711967 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-zfvwh"
Apr 24 16:39:31.712661 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:31.712644 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-zfvwh"
Apr 24 16:39:32.505330 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:32.505289 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:32.505495 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:32.505422 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:33.504872 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.504676 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:33.505432 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.504700 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:33.505432 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:33.504967 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:33.505432 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:33.505013 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:33.718891 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.718868 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log"
Apr 24 16:39:33.719237 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.719214 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"1a2c0acd53be6bdc25a83217e4e1ee719bea39c96c86c2f275b10157eaec2cd2"}
Apr 24 16:39:33.719532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.719510 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:33.719701 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.719685 2575 scope.go:117] "RemoveContainer" containerID="a1d671b2110d96277b86b9ee5cc7bcc9f02b35e3034772dade5e4fbe74a3f27a"
Apr 24 16:39:33.721008 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.720986 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="732471a75a8eb632efa8ea8675ff5411ac889b0abe0763f31bba2d43e92b1a2e" exitCode=0
Apr 24 16:39:33.721080 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.721062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"732471a75a8eb632efa8ea8675ff5411ac889b0abe0763f31bba2d43e92b1a2e"}
Apr 24 16:39:33.736441 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:33.736420 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:34.504618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.504591 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:34.504705 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:34.504682 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:34.622967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.622934 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pqspn"]
Apr 24 16:39:34.623540 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.623071 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:34.623540 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:34.623213 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:34.625271 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.625236 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2sm9w"]
Apr 24 16:39:34.628486 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.628427 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tgkjm"]
Apr 24 16:39:34.628588 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.628550 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:34.628677 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:34.628658 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:34.725178 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.725148 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="d5f5952348c5f2c2806d16c2ee5bc7688fc8461f96d373587cf640e96ac0fa64" exitCode=0
Apr 24 16:39:34.725316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.725231 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"d5f5952348c5f2c2806d16c2ee5bc7688fc8461f96d373587cf640e96ac0fa64"}
Apr 24 16:39:34.728663 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.728614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log"
Apr 24 16:39:34.729015 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.729001 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:34.729090 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.729029 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" event={"ID":"6e8405b8-571c-4fb5-8e11-7148ed4e4115","Type":"ContainerStarted","Data":"8d62a95d351fb42519880bb0f0e3d7c72b66a16dc00f4c26e3d4d82217503b5b"}
Apr 24 16:39:34.729211 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:34.729086 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:34.729267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.729218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:34.729267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.729266 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:34.744009 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.743989 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz"
Apr 24 16:39:34.786386 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:34.786343 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" podStartSLOduration=10.185477724 podStartE2EDuration="27.786332595s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.057631874 +0000 UTC m=+3.048320672" lastFinishedPulling="2026-04-24 16:39:27.658486743 +0000 UTC m=+20.649175543" observedRunningTime="2026-04-24 16:39:34.78434552 +0000 UTC m=+27.775034347" watchObservedRunningTime="2026-04-24 16:39:34.786332595 +0000 UTC m=+27.777021411"
Apr 24 16:39:35.732477 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:35.732285 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="7c7cb7ded0c988312ba34f9c4ed9334ec03ea7180e51f5a1c9dd7b232b973d69" exitCode=0
Apr 24 16:39:35.733040 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:35.732379 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"7c7cb7ded0c988312ba34f9c4ed9334ec03ea7180e51f5a1c9dd7b232b973d69"}
Apr 24 16:39:36.504953 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:36.504924 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:36.505160 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:36.505032 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:36.505160 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:36.505046 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:36.505160 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:36.505067 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:36.505315 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:36.505158 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:36.505315 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:36.505233 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:38.504920 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:38.504885 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:38.504920 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:38.504907 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:38.505481 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:38.504935 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:38.505481 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:38.505023 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:38.505481 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:38.505160 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:38.505481 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:38.505230 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:40.505224 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:40.505172 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:40.505673 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:40.505302 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:40.505673 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:40.505325 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2sm9w" podUID="2c1005ed-6b92-40fd-a607-3082a407e5c8"
Apr 24 16:39:40.505673 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:40.505415 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047"
Apr 24 16:39:40.505673 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:40.505459 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:40.505673 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:40.505538 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pqspn" podUID="a49b4d14-b188-40e4-828a-7109543078dc"
Apr 24 16:39:40.855521 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:40.855486 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-83.ec2.internal" event="NodeReady"
Apr 24 16:39:40.855699 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:40.855635 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 16:39:41.014747 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.014716 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-b68bd"]
Apr 24 16:39:41.050976 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.050949 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5nfqq"]
Apr 24 16:39:41.051169 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.051136 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.054070 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.054041 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 16:39:41.054815 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.054797 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 16:39:41.055088 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.055071 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6n2jd\""
Apr 24 16:39:41.068588 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.068562 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b68bd"]
Apr 24 16:39:41.068693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.068594 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nfqq"]
Apr 24 16:39:41.068735 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.068701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.071579 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.071561 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 16:39:41.071781 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.071766 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-szcnf\""
Apr 24 16:39:41.071890 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.071864 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 16:39:41.072014 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.071995 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 16:39:41.150422 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150341 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.150422 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150391 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-config-volume\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.150626 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150425 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:41.150626 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150464 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6j6j\" (UniqueName: \"kubernetes.io/projected/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-kube-api-access-c6j6j\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.150626 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150501 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.150626 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.150570 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:41.150785 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150628 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnszn\" (UniqueName: \"kubernetes.io/projected/b0e3d259-e5e4-4160-8258-8d97913d476a-kube-api-access-wnszn\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.150785 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.150633 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.150613301 +0000 UTC m=+66.141302100 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 16:39:41.150785 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.150682 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-tmp-dir\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.251826 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.251792 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.251826 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.251828 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnszn\" (UniqueName: \"kubernetes.io/projected/b0e3d259-e5e4-4160-8258-8d97913d476a-kube-api-access-wnszn\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.251857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-tmp-dir\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.251882 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.251897 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.251959 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.251974 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.251975 2575 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252019 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.752006087 +0000 UTC m=+34.742694882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252034 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:41.752025874 +0000 UTC m=+34.742714669 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.252057 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252049 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret podName:a49b4d14-b188-40e4-828a-7109543078dc nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.25204109 +0000 UTC m=+66.242729884 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret") pod "global-pull-secret-syncer-pqspn" (UID: "a49b4d14-b188-40e4-828a-7109543078dc") : object "kube-system"/"original-pull-secret" not registered
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.252065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-config-volume\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.252130 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.252163 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6j6j\" (UniqueName: \"kubernetes.io/projected/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-kube-api-access-c6j6j\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.252164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-tmp-dir\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252271 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252282 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252291 2575 projected.go:194] Error preparing data for projected volume kube-api-access-822vr for pod openshift-network-diagnostics/network-check-target-2sm9w: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:41.252457 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.252332 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr podName:2c1005ed-6b92-40fd-a607-3082a407e5c8 nodeName:}" failed. No retries permitted until 2026-04-24 16:40:13.252319734 +0000 UTC m=+66.243008528 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-822vr" (UniqueName: "kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr") pod "network-check-target-2sm9w" (UID: "2c1005ed-6b92-40fd-a607-3082a407e5c8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 16:39:41.252689 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.252583 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-config-volume\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.266292 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.266161 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6j6j\" (UniqueName: \"kubernetes.io/projected/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-kube-api-access-c6j6j\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.266410 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.266334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnszn\" (UniqueName: \"kubernetes.io/projected/b0e3d259-e5e4-4160-8258-8d97913d476a-kube-api-access-wnszn\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.746684 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.746644 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerStarted","Data":"fcf659860427abecd4568d7e8f7e7ba0ec20cb74f39be1841682c45296f4c626"}
Apr 24 16:39:41.756720 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.756699 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:41.756804 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:41.756774 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:41.756868 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.756852 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:41.756868 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.756859 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.756942 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.756916 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.756899739 +0000 UTC m=+35.747588536 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:41.756942 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:41.756934 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:39:42.756925648 +0000 UTC m=+35.747614443 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:42.504620 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.504585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn"
Apr 24 16:39:42.504773 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.504585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm"
Apr 24 16:39:42.504851 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.504585 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w"
Apr 24 16:39:42.509264 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 16:39:42.509394 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509261 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 16:39:42.509394 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509273 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xv2x8\""
Apr 24 16:39:42.509394 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 16:39:42.509394 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509243 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v7jl2\""
Apr 24 16:39:42.509588 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.509243 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 16:39:42.750244 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.750209 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="fcf659860427abecd4568d7e8f7e7ba0ec20cb74f39be1841682c45296f4c626" exitCode=0
Apr 24 16:39:42.750685 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.750272 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"fcf659860427abecd4568d7e8f7e7ba0ec20cb74f39be1841682c45296f4c626"}
Apr 24 16:39:42.765730 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.765684 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:42.765808 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:42.765740 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:42.765845 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:42.765815 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:42.765845 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:42.765834 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:42.765908 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:42.765864 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:39:44.765850645 +0000 UTC m=+37.756539439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:42.765908 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:42.765882 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:44.765871254 +0000 UTC m=+37.756560048 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:43.754230 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:43.754194 2575 generic.go:358] "Generic (PLEG): container finished" podID="647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b" containerID="1a60e0eb91d329cd09888c5318c4e8f2d217e341ffb11b0bf03d90ec983475ad" exitCode=0
Apr 24 16:39:43.754705 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:43.754256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerDied","Data":"1a60e0eb91d329cd09888c5318c4e8f2d217e341ffb11b0bf03d90ec983475ad"}
Apr 24 16:39:44.758867 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:44.758833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" event={"ID":"647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b","Type":"ContainerStarted","Data":"8d16ebac4f02347f72d6d24aae40f9fcbd1eee894a5b5370e6e985fa4796e27e"}
Apr 24 16:39:44.780627 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:44.780603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:44.780722 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:44.780666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:44.780761 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:44.780740 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:44.780761 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:44.780745 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:44.780821 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:44.780809 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:39:48.780794055 +0000 UTC m=+41.771482850 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:44.780860 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:44.780823 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:48.7808169 +0000 UTC m=+41.771505695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:44.784475 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:44.784433 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b9xc2" podStartSLOduration=6.298611054 podStartE2EDuration="37.784422528s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:39:10.056551954 +0000 UTC m=+3.047240752" lastFinishedPulling="2026-04-24 16:39:41.54236343 +0000 UTC m=+34.533052226" observedRunningTime="2026-04-24 16:39:44.782784916 +0000 UTC m=+37.773473733" watchObservedRunningTime="2026-04-24 16:39:44.784422528 +0000 UTC m=+37.775111345"
Apr 24 16:39:48.808084 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:48.808051 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:48.808498 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:48.808118 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:48.808498 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:48.808204 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:48.808498 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:48.808210 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:48.808498 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:48.808253 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.808239687 +0000 UTC m=+49.798928481 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:48.808498 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:48.808271 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:39:56.80825828 +0000 UTC m=+49.798947075 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:56.863061 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:56.863027 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd"
Apr 24 16:39:56.863588 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:56.863099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq"
Apr 24 16:39:56.863588 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:56.863212 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 16:39:56.863588 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:56.863228 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 16:39:56.863588 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:56.863276 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:40:12.863261705 +0000 UTC m=+65.853950503 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found
Apr 24 16:39:56.863588 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:39:56.863299 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:40:12.863283483 +0000 UTC m=+65.853972286 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found
Apr 24 16:39:57.743036 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.742981 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"]
Apr 24 16:39:57.784633 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.784601 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"]
Apr 24 16:39:57.784803 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.784773 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.787711 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.787693 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 16:39:57.787892 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.787741 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 16:39:57.787987 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.787911 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 16:39:57.787987 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.787966 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 16:39:57.805516 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.805496 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"]
Apr 24 16:39:57.805516 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.805518 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"]
Apr 24 16:39:57.805644 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.805598 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.808385 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.808366 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 16:39:57.808734 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.808718 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 16:39:57.808800 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.808737 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 16:39:57.808900 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.808880 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 16:39:57.870364 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870331 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870454 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29jc6\" (UniqueName: \"kubernetes.io/projected/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-kube-api-access-29jc6\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870482 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870504 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0351ca2-fa39-42bc-b374-b53d08923ed5-tmp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870523 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8cpp\" (UniqueName: \"kubernetes.io/projected/d0351ca2-fa39-42bc-b374-b53d08923ed5-kube-api-access-b8cpp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870581 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870601 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d0351ca2-fa39-42bc-b374-b53d08923ed5-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.870693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.870651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971445 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971410 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971607 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971607 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971492 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-29jc6\" (UniqueName: \"kubernetes.io/projected/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-kube-api-access-29jc6\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971607 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971607 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971535 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0351ca2-fa39-42bc-b374-b53d08923ed5-tmp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.971607 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8cpp\" (UniqueName: \"kubernetes.io/projected/d0351ca2-fa39-42bc-b374-b53d08923ed5-kube-api-access-b8cpp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.971841 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971666 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.971841 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d0351ca2-fa39-42bc-b374-b53d08923ed5-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.971943 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971875 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.972006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.971952 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d0351ca2-fa39-42bc-b374-b53d08923ed5-tmp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.972357 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.972334 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.975301 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.975274 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.975391 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.975277 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-ca\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.975391 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.975320 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.975461 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.975400 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d0351ca2-fa39-42bc-b374-b53d08923ed5-klusterlet-config\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:57.980534 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.980507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-29jc6\" (UniqueName: \"kubernetes.io/projected/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-kube-api-access-29jc6\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.986678 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.986658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/aaba430d-6bd1-4bba-b94f-f0ca122b7f17-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-6f7f895d88-lbst9\" (UID: \"aaba430d-6bd1-4bba-b94f-f0ca122b7f17\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:57.992535 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:57.992519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8cpp\" (UniqueName: \"kubernetes.io/projected/d0351ca2-fa39-42bc-b374-b53d08923ed5-kube-api-access-b8cpp\") pod \"klusterlet-addon-workmgr-68f9c6d688-t56dc\" (UID: \"d0351ca2-fa39-42bc-b374-b53d08923ed5\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:58.094485 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.094412 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:39:58.122441 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.122409 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:39:58.257887 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.257857 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"]
Apr 24 16:39:58.261144 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:58.261098 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaba430d_6bd1_4bba_b94f_f0ca122b7f17.slice/crio-4bcabf008a050ae154234e46b2c9b03fe1b2ecb297f2e9767d542ba51f11b4b8 WatchSource:0}: Error finding container 4bcabf008a050ae154234e46b2c9b03fe1b2ecb297f2e9767d542ba51f11b4b8: Status 404 returned error can't find the container with id 4bcabf008a050ae154234e46b2c9b03fe1b2ecb297f2e9767d542ba51f11b4b8
Apr 24 16:39:58.266033 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.265095 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"]
Apr 24 16:39:58.268491 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:39:58.268460 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0351ca2_fa39_42bc_b374_b53d08923ed5.slice/crio-71f2eb20c67569b606779c527beea72c0cfbe0411045888d2d08638b49302590 WatchSource:0}: Error finding container 71f2eb20c67569b606779c527beea72c0cfbe0411045888d2d08638b49302590: Status 404 returned error can't find the container with id 71f2eb20c67569b606779c527beea72c0cfbe0411045888d2d08638b49302590
Apr 24 16:39:58.787462 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.787423 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerStarted","Data":"71f2eb20c67569b606779c527beea72c0cfbe0411045888d2d08638b49302590"}
Apr 24 16:39:58.788596 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:39:58.788563 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerStarted","Data":"4bcabf008a050ae154234e46b2c9b03fe1b2ecb297f2e9767d542ba51f11b4b8"}
Apr 24 16:40:03.801788 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:03.801751 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerStarted","Data":"66770dff0d93297d66535e9672376cb03e107912a76671bfb85b8242488fedd1"}
Apr 24 16:40:03.802200 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:03.802140 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:40:03.803498 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:03.803456 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerStarted","Data":"78c70e2459deb989e906580ce90e912ef618acc785cf83e8cec2f5f0f1303b43"}
Apr 24 16:40:03.803807 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:03.803788 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc"
Apr 24 16:40:03.818727 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:03.818642 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" podStartSLOduration=1.875877659 podStartE2EDuration="6.818625186s" podCreationTimestamp="2026-04-24 16:39:57 +0000 UTC" firstStartedPulling="2026-04-24 16:39:58.270181898 +0000 UTC m=+51.260870693" lastFinishedPulling="2026-04-24 16:40:03.212929422 +0000 UTC m=+56.203618220" observedRunningTime="2026-04-24 16:40:03.818503822 +0000 UTC m=+56.809192642" watchObservedRunningTime="2026-04-24 16:40:03.818625186 +0000 UTC m=+56.809314007"
Apr 24 16:40:05.810046 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:05.809998 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerStarted","Data":"f4611f05031a9d95149ddfc1188356a2dca2581604ef0591bd7c1bfc64eadfa7"}
Apr 24 16:40:05.810046 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:05.810049 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerStarted","Data":"3cc9cb21ad1583d21be57ed66d953a48d9d1acc35795b40572146874be96e80e"}
Apr 24 16:40:05.837278 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:05.837233 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" podStartSLOduration=2.064801372 podStartE2EDuration="8.837219939s" podCreationTimestamp="2026-04-24 16:39:57 +0000 UTC" firstStartedPulling="2026-04-24 16:39:58.263592286 +0000 UTC m=+51.254281096" lastFinishedPulling="2026-04-24 16:40:05.036010855 +0000 UTC m=+58.026699663" observedRunningTime="2026-04-24 16:40:05.835876327 +0000 UTC m=+58.826565144" watchObservedRunningTime="2026-04-24 16:40:05.837219939 +0000 UTC 
m=+58.827908757" Apr 24 16:40:06.745414 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:06.745383 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-m22zz" Apr 24 16:40:12.884096 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:12.884056 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:40:12.884555 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:12.884147 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd" Apr 24 16:40:12.884555 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:12.884211 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:12.884555 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:12.884225 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:12.884555 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:12.884277 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:40:44.88426393 +0000 UTC m=+97.874952725 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found Apr 24 16:40:12.884555 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:12.884290 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:40:44.884283601 +0000 UTC m=+97.874972396 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found Apr 24 16:40:13.186318 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.186234 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:40:13.189020 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.189003 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 16:40:13.197303 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:13.197282 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:40:13.197395 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:13.197344 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:17.197323094 +0000 UTC m=+130.188011899 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : secret "metrics-daemon-secret" not found Apr 24 16:40:13.286734 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.286702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:40:13.286863 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.286755 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:40:13.289396 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.289381 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 16:40:13.289530 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.289512 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 16:40:13.299826 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.299791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 16:40:13.300009 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.299990 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a49b4d14-b188-40e4-828a-7109543078dc-original-pull-secret\") pod \"global-pull-secret-syncer-pqspn\" (UID: \"a49b4d14-b188-40e4-828a-7109543078dc\") " pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:40:13.310256 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.310235 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-822vr\" (UniqueName: \"kubernetes.io/projected/2c1005ed-6b92-40fd-a607-3082a407e5c8-kube-api-access-822vr\") pod \"network-check-target-2sm9w\" (UID: \"2c1005ed-6b92-40fd-a607-3082a407e5c8\") " pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:40:13.413706 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.413671 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pqspn" Apr 24 16:40:13.427512 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.427490 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v7jl2\"" Apr 24 16:40:13.435754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.435719 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:40:13.538092 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.538059 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pqspn"] Apr 24 16:40:13.541440 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:40:13.541410 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda49b4d14_b188_40e4_828a_7109543078dc.slice/crio-aaf41ed621095d1cbee1e2fd33396af52592dfab604a917f2a4aa95208ab6557 WatchSource:0}: Error finding container aaf41ed621095d1cbee1e2fd33396af52592dfab604a917f2a4aa95208ab6557: Status 404 returned error can't find the container with id aaf41ed621095d1cbee1e2fd33396af52592dfab604a917f2a4aa95208ab6557 Apr 24 16:40:13.561204 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.561178 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2sm9w"] Apr 24 16:40:13.564004 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:40:13.563972 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1005ed_6b92_40fd_a607_3082a407e5c8.slice/crio-dc5af07f23d7aa9aa15812e37c9653335e17bb5d98889a907979695c4f5447a3 WatchSource:0}: Error finding container dc5af07f23d7aa9aa15812e37c9653335e17bb5d98889a907979695c4f5447a3: Status 404 returned error can't find the container with id dc5af07f23d7aa9aa15812e37c9653335e17bb5d98889a907979695c4f5447a3 Apr 24 16:40:13.825750 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:13.825712 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2sm9w" event={"ID":"2c1005ed-6b92-40fd-a607-3082a407e5c8","Type":"ContainerStarted","Data":"dc5af07f23d7aa9aa15812e37c9653335e17bb5d98889a907979695c4f5447a3"} Apr 24 16:40:13.826576 ip-10-0-137-83 kubenswrapper[2575]: 
I0424 16:40:13.826555 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pqspn" event={"ID":"a49b4d14-b188-40e4-828a-7109543078dc","Type":"ContainerStarted","Data":"aaf41ed621095d1cbee1e2fd33396af52592dfab604a917f2a4aa95208ab6557"} Apr 24 16:40:18.841028 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:18.840990 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2sm9w" event={"ID":"2c1005ed-6b92-40fd-a607-3082a407e5c8","Type":"ContainerStarted","Data":"9f40fe302174be903bdb20c2e3f2090d27e1130eb41bb3f043b08449bb1b465f"} Apr 24 16:40:18.841501 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:18.841085 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:40:18.842276 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:18.842250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pqspn" event={"ID":"a49b4d14-b188-40e4-828a-7109543078dc","Type":"ContainerStarted","Data":"d40c5c7a17f960f0aeb1940417770618c97238d016c150ad63c61c853ef86349"} Apr 24 16:40:18.858464 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:18.858428 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2sm9w" podStartSLOduration=67.612478272 podStartE2EDuration="1m11.858417453s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:40:13.565661591 +0000 UTC m=+66.556350385" lastFinishedPulling="2026-04-24 16:40:17.81160077 +0000 UTC m=+70.802289566" observedRunningTime="2026-04-24 16:40:18.857270923 +0000 UTC m=+71.847959739" watchObservedRunningTime="2026-04-24 16:40:18.858417453 +0000 UTC m=+71.849106269" Apr 24 16:40:18.879704 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:18.879667 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kube-system/global-pull-secret-syncer-pqspn" podStartSLOduration=66.606591246 podStartE2EDuration="1m10.879657456s" podCreationTimestamp="2026-04-24 16:39:08 +0000 UTC" firstStartedPulling="2026-04-24 16:40:13.542893328 +0000 UTC m=+66.533582123" lastFinishedPulling="2026-04-24 16:40:17.815959536 +0000 UTC m=+70.806648333" observedRunningTime="2026-04-24 16:40:18.878684967 +0000 UTC m=+71.869373784" watchObservedRunningTime="2026-04-24 16:40:18.879657456 +0000 UTC m=+71.870346273" Apr 24 16:40:44.917858 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:44.917826 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd" Apr 24 16:40:44.918316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:44.917883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:40:44.918316 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:44.917976 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:40:44.918316 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:44.917984 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:40:44.918316 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:44.918040 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:48.918023087 +0000 UTC m=+161.908711883 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found Apr 24 16:40:44.918316 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:40:44.918055 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:41:48.918048666 +0000 UTC m=+161.908737461 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found Apr 24 16:40:49.847210 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:40:49.847165 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2sm9w" Apr 24 16:41:17.251379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:17.251327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:41:17.251912 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:17.251468 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 16:41:17.251912 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:17.251540 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs podName:f8b7b6cb-c76c-42e3-9193-9423bbd58047 nodeName:}" failed. No retries permitted until 2026-04-24 16:43:19.25152291 +0000 UTC m=+252.242211709 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs") pod "network-metrics-daemon-tgkjm" (UID: "f8b7b6cb-c76c-42e3-9193-9423bbd58047") : secret "metrics-daemon-secret" not found Apr 24 16:41:36.475878 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.475840 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"] Apr 24 16:41:36.478812 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.478792 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.529229 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.529203 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 16:41:36.530694 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.530678 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 16:41:36.531192 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.531175 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-l6l2l\"" Apr 24 16:41:36.532039 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.532027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 16:41:36.585763 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.585736 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"] Apr 24 16:41:36.590006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.589984 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590147 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590015 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590147 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590047 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590147 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590072 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590266 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590162 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590266 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590190 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590266 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590208 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.590266 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.590224 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrc9z\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.591225 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.591175 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 16:41:36.690977 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:41:36.690943 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.690977 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.690979 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691204 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691204 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691029 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nrc9z\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691204 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:36.691091 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:36.691204 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:36.691128 2575 projected.go:194] Error preparing 
data for projected volume registry-tls for pod openshift-image-registry/image-registry-848c5fcd6b-bpxr5: secret "image-registry-tls" not found Apr 24 16:41:36.691204 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:36.691195 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls podName:a700cfda-aa55-4939-9d44-8aabc257f1bb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:37.191174979 +0000 UTC m=+150.181863775 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls") pod "image-registry-848c5fcd6b-bpxr5" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb") : secret "image-registry-tls" not found Apr 24 16:41:36.691494 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691225 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691494 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691265 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691494 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691319 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691494 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691340 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.691771 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.691750 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.692158 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.692137 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.693697 ip-10-0-137-83 kubenswrapper[2575]: 
I0424 16:41:36.693679 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.693760 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.693731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.700211 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.700192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:36.701263 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:36.701247 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrc9z\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:37.194416 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:37.194380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod 
\"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:37.194581 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:37.194537 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:37.194581 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:37.194555 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-848c5fcd6b-bpxr5: secret "image-registry-tls" not found Apr 24 16:41:37.194653 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:37.194623 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls podName:a700cfda-aa55-4939-9d44-8aabc257f1bb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:38.194604008 +0000 UTC m=+151.185292822 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls") pod "image-registry-848c5fcd6b-bpxr5" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb") : secret "image-registry-tls" not found Apr 24 16:41:38.202403 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:38.202371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:38.202749 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:38.202518 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:38.202749 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:38.202536 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-848c5fcd6b-bpxr5: secret "image-registry-tls" not found Apr 24 16:41:38.202749 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:38.202593 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls podName:a700cfda-aa55-4939-9d44-8aabc257f1bb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:40.202578851 +0000 UTC m=+153.193267647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls") pod "image-registry-848c5fcd6b-bpxr5" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb") : secret "image-registry-tls" not found Apr 24 16:41:40.217024 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:40.216982 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:40.217492 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:40.217089 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:40.217492 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:40.217100 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-848c5fcd6b-bpxr5: secret "image-registry-tls" not found Apr 24 16:41:40.217492 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:40.217163 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls podName:a700cfda-aa55-4939-9d44-8aabc257f1bb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:44.217150351 +0000 UTC m=+157.207839147 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls") pod "image-registry-848c5fcd6b-bpxr5" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb") : secret "image-registry-tls" not found Apr 24 16:41:42.259295 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.259254 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc"] Apr 24 16:41:42.262572 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.262556 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.265186 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.265163 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 24 16:41:42.265493 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.265475 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 24 16:41:42.265620 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.265604 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-5lhtf\"" Apr 24 16:41:42.274508 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.274488 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc"] Apr 24 16:41:42.331061 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.331031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: 
\"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.331205 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.331128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/796ab223-e545-471e-9868-71174bdad1bf-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.431959 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.431932 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.432215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.432003 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/796ab223-e545-471e-9868-71174bdad1bf-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.432215 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:42.432074 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:41:42.432215 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:42.432188 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert 
podName:796ab223-e545-471e-9868-71174bdad1bf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:42.932168138 +0000 UTC m=+155.922856933 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qsbsc" (UID: "796ab223-e545-471e-9868-71174bdad1bf") : secret "networking-console-plugin-cert" not found Apr 24 16:41:42.433142 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.433125 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/796ab223-e545-471e-9868-71174bdad1bf-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.936294 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.936253 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:42.936461 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:42.936394 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:41:42.936461 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:42.936460 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert podName:796ab223-e545-471e-9868-71174bdad1bf nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:43.936445824 +0000 UTC m=+156.927134627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qsbsc" (UID: "796ab223-e545-471e-9868-71174bdad1bf") : secret "networking-console-plugin-cert" not found Apr 24 16:41:42.970714 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:42.970691 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-49ml9_7f2523fe-21a3-46f7-a03b-88e7ae991338/dns-node-resolver/0.log" Apr 24 16:41:43.770963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:43.770936 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvwg9_f09d386a-3466-46d1-a1d1-efb87cc77eba/node-ca/0.log" Apr 24 16:41:43.944536 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:43.944501 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:43.944713 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:43.944627 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:41:43.944776 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:43.944721 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert podName:796ab223-e545-471e-9868-71174bdad1bf nodeName:}" failed. 
No retries permitted until 2026-04-24 16:41:45.944701243 +0000 UTC m=+158.935390042 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qsbsc" (UID: "796ab223-e545-471e-9868-71174bdad1bf") : secret "networking-console-plugin-cert" not found Apr 24 16:41:44.064023 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:44.063948 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-b68bd" podUID="25e3ae92-c693-4c50-b5ce-0ed6ad115edd" Apr 24 16:41:44.079234 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:44.079204 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-5nfqq" podUID="b0e3d259-e5e4-4160-8258-8d97913d476a" Apr 24 16:41:44.245889 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:44.245856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:44.246019 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:44.246000 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 16:41:44.246075 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:44.246021 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-848c5fcd6b-bpxr5: secret 
"image-registry-tls" not found Apr 24 16:41:44.246075 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:44.246074 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls podName:a700cfda-aa55-4939-9d44-8aabc257f1bb nodeName:}" failed. No retries permitted until 2026-04-24 16:41:52.246058842 +0000 UTC m=+165.236747641 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls") pod "image-registry-848c5fcd6b-bpxr5" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb") : secret "image-registry-tls" not found Apr 24 16:41:45.038356 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:45.038327 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-b68bd" Apr 24 16:41:45.519685 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:45.519643 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-tgkjm" podUID="f8b7b6cb-c76c-42e3-9193-9423bbd58047" Apr 24 16:41:45.958709 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:45.958676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:45.958859 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:45.958825 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:41:45.958901 ip-10-0-137-83 
kubenswrapper[2575]: E0424 16:41:45.958891 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert podName:796ab223-e545-471e-9868-71174bdad1bf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:49.95887451 +0000 UTC m=+162.949563305 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qsbsc" (UID: "796ab223-e545-471e-9868-71174bdad1bf") : secret "networking-console-plugin-cert" not found Apr 24 16:41:48.978215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:48.978168 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:41:48.978813 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:48.978241 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd" Apr 24 16:41:48.978813 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:48.978337 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 16:41:48.978813 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:48.978336 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 16:41:48.978813 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:48.978400 2575 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls podName:25e3ae92-c693-4c50-b5ce-0ed6ad115edd nodeName:}" failed. No retries permitted until 2026-04-24 16:43:50.978387254 +0000 UTC m=+283.969076050 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls") pod "dns-default-b68bd" (UID: "25e3ae92-c693-4c50-b5ce-0ed6ad115edd") : secret "dns-default-metrics-tls" not found Apr 24 16:41:48.978813 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:48.978414 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert podName:b0e3d259-e5e4-4160-8258-8d97913d476a nodeName:}" failed. No retries permitted until 2026-04-24 16:43:50.97840797 +0000 UTC m=+283.969096765 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert") pod "ingress-canary-5nfqq" (UID: "b0e3d259-e5e4-4160-8258-8d97913d476a") : secret "canary-serving-cert" not found Apr 24 16:41:49.986219 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:49.986188 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:49.986612 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:49.986281 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 16:41:49.986612 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:41:49.986333 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert podName:796ab223-e545-471e-9868-71174bdad1bf nodeName:}" failed. No retries permitted until 2026-04-24 16:41:57.986320271 +0000 UTC m=+170.977009066 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-qsbsc" (UID: "796ab223-e545-471e-9868-71174bdad1bf") : secret "networking-console-plugin-cert" not found Apr 24 16:41:52.305047 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:52.305015 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:52.307775 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:52.307737 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"image-registry-848c5fcd6b-bpxr5\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") " pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:52.387007 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:52.386967 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:52.508770 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:52.508741 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"] Apr 24 16:41:52.514185 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:41:52.514163 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda700cfda_aa55_4939_9d44_8aabc257f1bb.slice/crio-c20288027e5b35a403c7fff89f48e2ee085a2ad13a2fe7d79e2695a1722c1697 WatchSource:0}: Error finding container c20288027e5b35a403c7fff89f48e2ee085a2ad13a2fe7d79e2695a1722c1697: Status 404 returned error can't find the container with id c20288027e5b35a403c7fff89f48e2ee085a2ad13a2fe7d79e2695a1722c1697 Apr 24 16:41:53.058543 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:53.058508 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" event={"ID":"a700cfda-aa55-4939-9d44-8aabc257f1bb","Type":"ContainerStarted","Data":"fbda478aa3030e4a256723e7ab80f5a6d4bcf50e41c07c673d36d0d731145d4c"} Apr 24 16:41:53.058543 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:53.058550 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" event={"ID":"a700cfda-aa55-4939-9d44-8aabc257f1bb","Type":"ContainerStarted","Data":"c20288027e5b35a403c7fff89f48e2ee085a2ad13a2fe7d79e2695a1722c1697"} Apr 24 16:41:53.058738 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:53.058633 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" Apr 24 16:41:53.088261 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:53.088215 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" 
podStartSLOduration=17.088200541 podStartE2EDuration="17.088200541s" podCreationTimestamp="2026-04-24 16:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:41:53.087430072 +0000 UTC m=+166.078118886" watchObservedRunningTime="2026-04-24 16:41:53.088200541 +0000 UTC m=+166.078889358" Apr 24 16:41:54.504555 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:54.504527 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:41:58.048466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:58.048376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:58.050788 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:58.050767 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/796ab223-e545-471e-9868-71174bdad1bf-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-qsbsc\" (UID: \"796ab223-e545-471e-9868-71174bdad1bf\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:58.171735 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:58.171677 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" Apr 24 16:41:58.286203 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:58.286163 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc"] Apr 24 16:41:58.289811 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:41:58.289781 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod796ab223_e545_471e_9868_71174bdad1bf.slice/crio-c5e2b3d992e10de2f9d50336d84ded23890f5e50f8894a7c6ee10f796926983e WatchSource:0}: Error finding container c5e2b3d992e10de2f9d50336d84ded23890f5e50f8894a7c6ee10f796926983e: Status 404 returned error can't find the container with id c5e2b3d992e10de2f9d50336d84ded23890f5e50f8894a7c6ee10f796926983e Apr 24 16:41:58.504954 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:58.504917 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:41:59.073747 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:41:59.073709 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" event={"ID":"796ab223-e545-471e-9868-71174bdad1bf","Type":"ContainerStarted","Data":"c5e2b3d992e10de2f9d50336d84ded23890f5e50f8894a7c6ee10f796926983e"} Apr 24 16:42:00.077877 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:00.077839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" event={"ID":"796ab223-e545-471e-9868-71174bdad1bf","Type":"ContainerStarted","Data":"637eab97bbde0143f758d1fa8d3a07da77ec3d14b35fb6e32a114ed07f836cad"} Apr 24 16:42:00.094409 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:00.094366 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-qsbsc" podStartSLOduration=17.134966346 podStartE2EDuration="18.094350938s" podCreationTimestamp="2026-04-24 16:41:42 +0000 UTC" firstStartedPulling="2026-04-24 16:41:58.291546348 +0000 UTC m=+171.282235143" lastFinishedPulling="2026-04-24 16:41:59.250930939 +0000 UTC m=+172.241619735" observedRunningTime="2026-04-24 16:42:00.094128453 +0000 UTC m=+173.084817265" watchObservedRunningTime="2026-04-24 16:42:00.094350938 +0000 UTC m=+173.085039755" Apr 24 16:42:03.802935 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:03.802879 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" podUID="d0351ca2-fa39-42bc-b374-b53d08923ed5" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.8:8000/readyz\": dial tcp 10.132.0.8:8000: connect: connection refused" Apr 24 16:42:04.090553 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:04.090464 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0351ca2-fa39-42bc-b374-b53d08923ed5" containerID="66770dff0d93297d66535e9672376cb03e107912a76671bfb85b8242488fedd1" exitCode=1 Apr 24 16:42:04.090687 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:04.090542 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerDied","Data":"66770dff0d93297d66535e9672376cb03e107912a76671bfb85b8242488fedd1"} Apr 24 16:42:04.090928 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:04.090914 2575 scope.go:117] "RemoveContainer" containerID="66770dff0d93297d66535e9672376cb03e107912a76671bfb85b8242488fedd1" Apr 24 16:42:05.094932 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.094897 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" 
event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerStarted","Data":"f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2"} Apr 24 16:42:05.095318 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.095154 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:42:05.095835 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.095806 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:42:05.267975 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.267948 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-9sq68"] Apr 24 16:42:05.271043 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.271023 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.276268 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.276210 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 16:42:05.276415 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.276383 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvkrd\"" Apr 24 16:42:05.276485 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.276408 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 16:42:05.276485 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.276466 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 16:42:05.277383 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.277286 2575 
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 16:42:05.282695 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.282674 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9sq68"] Apr 24 16:42:05.321729 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.321698 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-zw6kr"] Apr 24 16:42:05.324619 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.324599 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:05.327227 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.327190 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 24 16:42:05.327336 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.327282 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 24 16:42:05.327398 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.327331 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-df2qz\"" Apr 24 16:42:05.330059 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.330040 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"] Apr 24 16:42:05.338004 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.337980 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zw6kr"] Apr 24 16:42:05.365361 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.365307 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b969cbff9-f4cgw"] Apr 24 16:42:05.368168 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:42:05.368154 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.380237 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.380219 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b969cbff9-f4cgw"] Apr 24 16:42:05.403980 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.403944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrltj\" (UniqueName: \"kubernetes.io/projected/e75cf70a-707a-43de-bb15-bdf89460c4ca-kube-api-access-qrltj\") pod \"downloads-6bcc868b7-zw6kr\" (UID: \"e75cf70a-707a-43de-bb15-bdf89460c4ca\") " pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:05.404127 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.403986 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f900db-1f85-4389-9d11-55baa30ef7c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.404127 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.404031 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f900db-1f85-4389-9d11-55baa30ef7c7-crio-socket\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.404234 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.404125 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hntxz\" (UniqueName: 
\"kubernetes.io/projected/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-api-access-hntxz\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.404234 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.404164 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f900db-1f85-4389-9d11-55baa30ef7c7-data-volume\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.404234 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.404189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.504878 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-ca-trust-extracted\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504888 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrltj\" (UniqueName: \"kubernetes.io/projected/e75cf70a-707a-43de-bb15-bdf89460c4ca-kube-api-access-qrltj\") pod \"downloads-6bcc868b7-zw6kr\" (UID: \"e75cf70a-707a-43de-bb15-bdf89460c4ca\") " 
pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:05.505006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504908 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-image-registry-private-configuration\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-bound-sa-token\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504974 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f900db-1f85-4389-9d11-55baa30ef7c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.504991 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-installation-pull-secrets\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505022 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-tls\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505069 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f900db-1f85-4389-9d11-55baa30ef7c7-crio-socket\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505153 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/01f900db-1f85-4389-9d11-55baa30ef7c7-crio-socket\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505167 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-certificates\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505210 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-trusted-ca\") pod \"image-registry-b969cbff9-f4cgw\" (UID: 
\"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505237 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z952\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-kube-api-access-8z952\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.505534 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hntxz\" (UniqueName: \"kubernetes.io/projected/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-api-access-hntxz\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505534 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505311 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f900db-1f85-4389-9d11-55baa30ef7c7-data-volume\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505534 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505336 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505686 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505590 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/01f900db-1f85-4389-9d11-55baa30ef7c7-data-volume\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.505876 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.505856 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.507524 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.507504 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/01f900db-1f85-4389-9d11-55baa30ef7c7-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.514032 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.514014 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hntxz\" (UniqueName: \"kubernetes.io/projected/01f900db-1f85-4389-9d11-55baa30ef7c7-kube-api-access-hntxz\") pod \"insights-runtime-extractor-9sq68\" (UID: \"01f900db-1f85-4389-9d11-55baa30ef7c7\") " pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.514273 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.514257 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrltj\" (UniqueName: \"kubernetes.io/projected/e75cf70a-707a-43de-bb15-bdf89460c4ca-kube-api-access-qrltj\") pod \"downloads-6bcc868b7-zw6kr\" (UID: \"e75cf70a-707a-43de-bb15-bdf89460c4ca\") " pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:05.579898 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.579868 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-9sq68" Apr 24 16:42:05.606388 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606356 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-certificates\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606555 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606396 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-trusted-ca\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606555 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z952\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-kube-api-access-8z952\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606555 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606493 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-ca-trust-extracted\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606555 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:42:05.606548 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-image-registry-private-configuration\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606762 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606574 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-bound-sa-token\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606762 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606606 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-installation-pull-secrets\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606762 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606655 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-tls\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.606989 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.606966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-ca-trust-extracted\") pod 
\"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.607218 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.607180 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-certificates\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.608121 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.608074 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-trusted-ca\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.609260 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.609239 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-registry-tls\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.609425 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.609275 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-installation-pull-secrets\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.609728 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.609712 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-image-registry-private-configuration\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.623489 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.623426 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-bound-sa-token\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.624210 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.624192 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z952\" (UniqueName: \"kubernetes.io/projected/8e30bd49-e02e-4cb9-9908-d3dd4ede132a-kube-api-access-8z952\") pod \"image-registry-b969cbff9-f4cgw\" (UID: \"8e30bd49-e02e-4cb9-9908-d3dd4ede132a\") " pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.633025 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.633000 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:05.676554 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.676525 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:05.713999 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.713875 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-9sq68"] Apr 24 16:42:05.718567 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:05.718527 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f900db_1f85_4389_9d11_55baa30ef7c7.slice/crio-b3ebf92cd15ca04c39576bb56a34dec7b8aa10b5a55fca331a5d58333101853a WatchSource:0}: Error finding container b3ebf92cd15ca04c39576bb56a34dec7b8aa10b5a55fca331a5d58333101853a: Status 404 returned error can't find the container with id b3ebf92cd15ca04c39576bb56a34dec7b8aa10b5a55fca331a5d58333101853a Apr 24 16:42:05.767792 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.767762 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-zw6kr"] Apr 24 16:42:05.773949 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:05.773915 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode75cf70a_707a_43de_bb15_bdf89460c4ca.slice/crio-75ca2b048b66db989bd1789a068eb95c2b84935da69a8e63c5c34e3b9a08be0e WatchSource:0}: Error finding container 75ca2b048b66db989bd1789a068eb95c2b84935da69a8e63c5c34e3b9a08be0e: Status 404 returned error can't find the container with id 75ca2b048b66db989bd1789a068eb95c2b84935da69a8e63c5c34e3b9a08be0e Apr 24 16:42:05.809288 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:05.809243 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b969cbff9-f4cgw"] Apr 24 16:42:05.816188 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:05.816129 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e30bd49_e02e_4cb9_9908_d3dd4ede132a.slice/crio-4b7f1ba4e0ad35a01644248c589c268460e4cb17c258cd18fb4c024678f9ce2d WatchSource:0}: Error finding container 4b7f1ba4e0ad35a01644248c589c268460e4cb17c258cd18fb4c024678f9ce2d: Status 404 returned error can't find the container with id 4b7f1ba4e0ad35a01644248c589c268460e4cb17c258cd18fb4c024678f9ce2d Apr 24 16:42:06.102544 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.102504 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9sq68" event={"ID":"01f900db-1f85-4389-9d11-55baa30ef7c7","Type":"ContainerStarted","Data":"73ad25b9f45c7ecad03ff6637c3a52b61f1d76eb7ec0745ef396ef3e5cbf72ee"} Apr 24 16:42:06.102971 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.102552 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9sq68" event={"ID":"01f900db-1f85-4389-9d11-55baa30ef7c7","Type":"ContainerStarted","Data":"b3ebf92cd15ca04c39576bb56a34dec7b8aa10b5a55fca331a5d58333101853a"} Apr 24 16:42:06.103864 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.103833 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerStarted","Data":"b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83"} Apr 24 16:42:06.103864 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.103867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerStarted","Data":"4b7f1ba4e0ad35a01644248c589c268460e4cb17c258cd18fb4c024678f9ce2d"} Apr 24 16:42:06.104053 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.103955 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:42:06.104944 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.104925 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zw6kr" event={"ID":"e75cf70a-707a-43de-bb15-bdf89460c4ca","Type":"ContainerStarted","Data":"75ca2b048b66db989bd1789a068eb95c2b84935da69a8e63c5c34e3b9a08be0e"} Apr 24 16:42:06.126188 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:06.126124 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podStartSLOduration=1.126089621 podStartE2EDuration="1.126089621s" podCreationTimestamp="2026-04-24 16:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:42:06.125674106 +0000 UTC m=+179.116362936" watchObservedRunningTime="2026-04-24 16:42:06.126089621 +0000 UTC m=+179.116778439" Apr 24 16:42:07.110628 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:07.110588 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9sq68" event={"ID":"01f900db-1f85-4389-9d11-55baa30ef7c7","Type":"ContainerStarted","Data":"fd29ced4227f9efd7cd9fec17cea7b7126c2f09be289e53d7b88e41ecf9e2620"} Apr 24 16:42:08.116243 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:08.116151 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-9sq68" event={"ID":"01f900db-1f85-4389-9d11-55baa30ef7c7","Type":"ContainerStarted","Data":"a5ec601d7efcb8a49fbcec12d7eb751439dd559d5f908a6727a371e3d752b821"} Apr 24 16:42:08.136655 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:08.136599 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-9sq68" podStartSLOduration=1.144097274 podStartE2EDuration="3.136583138s" 
podCreationTimestamp="2026-04-24 16:42:05 +0000 UTC" firstStartedPulling="2026-04-24 16:42:05.793720203 +0000 UTC m=+178.784408997" lastFinishedPulling="2026-04-24 16:42:07.786206061 +0000 UTC m=+180.776894861" observedRunningTime="2026-04-24 16:42:08.135629328 +0000 UTC m=+181.126318143" watchObservedRunningTime="2026-04-24 16:42:08.136583138 +0000 UTC m=+181.127271956" Apr 24 16:42:14.172802 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.172768 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jbmtl"] Apr 24 16:42:14.179707 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.179682 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.183529 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183505 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 16:42:14.183529 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183522 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 16:42:14.183529 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183549 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 16:42:14.183814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183511 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 24 16:42:14.183814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183672 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 24 16:42:14.183814 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.183760 2575 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-r2j58\"" Apr 24 16:42:14.187603 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.187578 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jbmtl"] Apr 24 16:42:14.282225 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.282189 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.282392 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.282295 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d37af086-7938-46de-bda3-6b2a30be7321-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.282392 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.282330 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.282495 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.282466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48x9w\" (UniqueName: 
\"kubernetes.io/projected/d37af086-7938-46de-bda3-6b2a30be7321-kube-api-access-48x9w\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.383903 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.383873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d37af086-7938-46de-bda3-6b2a30be7321-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.384066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.383924 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.384066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.383997 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-48x9w\" (UniqueName: \"kubernetes.io/projected/d37af086-7938-46de-bda3-6b2a30be7321-kube-api-access-48x9w\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.384066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.384042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: 
\"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.384673 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.384647 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d37af086-7938-46de-bda3-6b2a30be7321-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.386841 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.386815 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.386972 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.386852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d37af086-7938-46de-bda3-6b2a30be7321-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.393006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.392984 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-48x9w\" (UniqueName: \"kubernetes.io/projected/d37af086-7938-46de-bda3-6b2a30be7321-kube-api-access-48x9w\") pod \"prometheus-operator-5676c8c784-jbmtl\" (UID: \"d37af086-7938-46de-bda3-6b2a30be7321\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:14.491252 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:14.491219 2575 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" Apr 24 16:42:15.336201 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:15.336164 2575 patch_prober.go:28] interesting pod/image-registry-848c5fcd6b-bpxr5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:42:15.336610 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:15.336238 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:42:16.535395 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.535361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:42:16.540400 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.540378 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.543069 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.543045 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 24 16:42:16.543069 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.543064 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 24 16:42:16.543226 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.543090 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 24 16:42:16.544294 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.544225 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 24 16:42:16.544413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.544308 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 24 16:42:16.544413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.544319 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-6wdtj\"" Apr 24 16:42:16.549143 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.549077 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 24 16:42:16.551621 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.551584 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:42:16.706201 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706169 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706201 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706207 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706430 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706430 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706308 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706430 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706389 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706559 ip-10-0-137-83 kubenswrapper[2575]: 
I0424 16:42:16.706435 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.706559 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.706471 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807431 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807354 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807431 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807398 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807634 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807465 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " 
pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807634 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807494 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807634 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807531 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807634 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807564 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.807634 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.807595 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.808482 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.808453 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config\") pod \"console-6d7fbd94b-xn8fc\" (UID: 
\"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.808598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.808457 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.808598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.808565 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.808713 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.808614 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.810299 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.810272 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.810413 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.810299 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config\") pod 
\"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.816967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.816941 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc\") pod \"console-6d7fbd94b-xn8fc\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:16.852136 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:16.852090 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:42:21.301269 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:21.301239 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-jbmtl"] Apr 24 16:42:21.314261 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:21.314235 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:42:21.409266 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:21.409224 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd37af086_7938_46de_bda3_6b2a30be7321.slice/crio-235b272a9f4781c7c9250b08bebe3f0a44a2a653a14ca3e89867facfda03deab WatchSource:0}: Error finding container 235b272a9f4781c7c9250b08bebe3f0a44a2a653a14ca3e89867facfda03deab: Status 404 returned error can't find the container with id 235b272a9f4781c7c9250b08bebe3f0a44a2a653a14ca3e89867facfda03deab Apr 24 16:42:21.409499 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:21.409479 2575 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd66365_6a23_4c9b_a037_0d6e249a056e.slice/crio-a263775cf61ba6d5250aa13fd65144e2947372ebed81a2c0add59190a8dcce4b WatchSource:0}: Error finding container a263775cf61ba6d5250aa13fd65144e2947372ebed81a2c0add59190a8dcce4b: Status 404 returned error can't find the container with id a263775cf61ba6d5250aa13fd65144e2947372ebed81a2c0add59190a8dcce4b Apr 24 16:42:22.162351 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.162290 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7fbd94b-xn8fc" event={"ID":"0bd66365-6a23-4c9b-a037-0d6e249a056e","Type":"ContainerStarted","Data":"a263775cf61ba6d5250aa13fd65144e2947372ebed81a2c0add59190a8dcce4b"} Apr 24 16:42:22.163978 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.163943 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" event={"ID":"d37af086-7938-46de-bda3-6b2a30be7321","Type":"ContainerStarted","Data":"235b272a9f4781c7c9250b08bebe3f0a44a2a653a14ca3e89867facfda03deab"} Apr 24 16:42:22.166093 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.166062 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-zw6kr" event={"ID":"e75cf70a-707a-43de-bb15-bdf89460c4ca","Type":"ContainerStarted","Data":"02b5b005e0c1f768f27b6fd3a87d18dc3abe718d4325dc44bfbba899e68ec949"} Apr 24 16:42:22.166453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.166433 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:22.181992 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.181960 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-zw6kr" Apr 24 16:42:22.188471 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:22.186947 2575 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-console/downloads-6bcc868b7-zw6kr" podStartSLOduration=1.267850255 podStartE2EDuration="17.186931573s" podCreationTimestamp="2026-04-24 16:42:05 +0000 UTC" firstStartedPulling="2026-04-24 16:42:05.775924466 +0000 UTC m=+178.766613269" lastFinishedPulling="2026-04-24 16:42:21.695005786 +0000 UTC m=+194.685694587" observedRunningTime="2026-04-24 16:42:22.185278857 +0000 UTC m=+195.175967675" watchObservedRunningTime="2026-04-24 16:42:22.186931573 +0000 UTC m=+195.177620407" Apr 24 16:42:23.179945 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:23.179909 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" event={"ID":"d37af086-7938-46de-bda3-6b2a30be7321","Type":"ContainerStarted","Data":"a3f50ed4bf137d8ffeb5e9c3a1a04636dd2beab65b28fba50eb8faa9e496bb52"} Apr 24 16:42:24.187770 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:24.187716 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" event={"ID":"d37af086-7938-46de-bda3-6b2a30be7321","Type":"ContainerStarted","Data":"a4cde2fe0bcca26b83235bf190bba4195dd3186afe59055ba27bf98364aa3951"} Apr 24 16:42:24.205339 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:24.205278 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-jbmtl" podStartSLOduration=8.612018342 podStartE2EDuration="10.20525718s" podCreationTimestamp="2026-04-24 16:42:14 +0000 UTC" firstStartedPulling="2026-04-24 16:42:21.425768844 +0000 UTC m=+194.416457639" lastFinishedPulling="2026-04-24 16:42:23.019007674 +0000 UTC m=+196.009696477" observedRunningTime="2026-04-24 16:42:24.204093189 +0000 UTC m=+197.194782014" watchObservedRunningTime="2026-04-24 16:42:24.20525718 +0000 UTC m=+197.195945997" Apr 24 16:42:25.335391 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:25.335358 2575 patch_prober.go:28] interesting 
pod/image-registry-848c5fcd6b-bpxr5 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:42:25.335806 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:25.335414 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:42:25.681492 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:25.681459 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:42:25.681664 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:25.681513 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:42:26.193852 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.193812 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7fbd94b-xn8fc" event={"ID":"0bd66365-6a23-4c9b-a037-0d6e249a056e","Type":"ContainerStarted","Data":"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c"} Apr 24 16:42:26.231857 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.231798 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d7fbd94b-xn8fc" podStartSLOduration=6.564297927 
podStartE2EDuration="10.23178067s" podCreationTimestamp="2026-04-24 16:42:16 +0000 UTC" firstStartedPulling="2026-04-24 16:42:21.425775393 +0000 UTC m=+194.416464190" lastFinishedPulling="2026-04-24 16:42:25.093258126 +0000 UTC m=+198.083946933" observedRunningTime="2026-04-24 16:42:26.231276981 +0000 UTC m=+199.221965798" watchObservedRunningTime="2026-04-24 16:42:26.23178067 +0000 UTC m=+199.222469486" Apr 24 16:42:26.528200 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.528165 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"] Apr 24 16:42:26.554444 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.554412 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"] Apr 24 16:42:26.554590 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.554543 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.557359 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.557330 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 24 16:42:26.557810 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.557787 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 24 16:42:26.557935 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.557831 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-mxsx2\"" Apr 24 16:42:26.562750 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.562726 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5gkcs"] Apr 24 16:42:26.582340 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.582316 2575 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.584872 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.584847 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 16:42:26.584959 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.584913 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 16:42:26.585027 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.584977 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-fvkdr\"" Apr 24 16:42:26.585184 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.585160 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 16:42:26.594677 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594650 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.594779 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5hwm\" (UniqueName: \"kubernetes.io/projected/c769eab2-225b-4974-b9d8-355c809d940c-kube-api-access-k5hwm\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594779 ip-10-0-137-83 
kubenswrapper[2575]: I0424 16:42:26.594753 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-root\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594877 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594806 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594877 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594835 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-metrics-client-ca\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594877 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594860 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-wtmp\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594986 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594929 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-textfile\") pod \"node-exporter-5gkcs\" (UID: 
\"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.594986 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.594978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.595047 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.595010 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.595047 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.595032 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25jv\" (UniqueName: \"kubernetes.io/projected/1df51550-2fa9-479c-9498-a30481fa319b-kube-api-access-j25jv\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.595161 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.595055 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df51550-2fa9-479c-9498-a30481fa319b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.595161 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.595081 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-sys\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.595263 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.595184 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696551 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-root\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696603 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696634 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-metrics-client-ca\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " 
pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696670 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-wtmp\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696670 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-root\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-textfile\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696737 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696806 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j25jv\" (UniqueName: \"kubernetes.io/projected/1df51550-2fa9-479c-9498-a30481fa319b-kube-api-access-j25jv\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696818 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-wtmp\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696829 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df51550-2fa9-479c-9498-a30481fa319b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696883 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-sys\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.696963 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696929 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.697162 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.696965 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.697162 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.697001 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5hwm\" (UniqueName: \"kubernetes.io/projected/c769eab2-225b-4974-b9d8-355c809d940c-kube-api-access-k5hwm\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.697307 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.697288 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-metrics-client-ca\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.697407 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.697383 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c769eab2-225b-4974-b9d8-355c809d940c-sys\") pod \"node-exporter-5gkcs\" (UID: 
\"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.697440 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.697404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1df51550-2fa9-479c-9498-a30481fa319b-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" Apr 24 16:42:26.697908 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:42:26.697881 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 16:42:26.698213 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:42:26.697952 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls podName:c769eab2-225b-4974-b9d8-355c809d940c nodeName:}" failed. No retries permitted until 2026-04-24 16:42:27.197932972 +0000 UTC m=+200.188621805 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls") pod "node-exporter-5gkcs" (UID: "c769eab2-225b-4974-b9d8-355c809d940c") : secret "node-exporter-tls" not found Apr 24 16:42:26.698213 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.697964 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-textfile\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.698213 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.698156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-accelerators-collector-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.700441 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.700415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:26.700546 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.700507 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"
Apr 24 16:42:26.700625 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.700604 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1df51550-2fa9-479c-9498-a30481fa319b-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"
Apr 24 16:42:26.709803 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.708789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5hwm\" (UniqueName: \"kubernetes.io/projected/c769eab2-225b-4974-b9d8-355c809d940c-kube-api-access-k5hwm\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs"
Apr 24 16:42:26.709803 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.709077 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25jv\" (UniqueName: \"kubernetes.io/projected/1df51550-2fa9-479c-9498-a30481fa319b-kube-api-access-j25jv\") pod \"openshift-state-metrics-9d44df66c-zswgf\" (UID: \"1df51550-2fa9-479c-9498-a30481fa319b\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"
Apr 24 16:42:26.852784 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.852701 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6d7fbd94b-xn8fc"
Apr 24 16:42:26.852784 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.852749 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6d7fbd94b-xn8fc"
Apr 24 16:42:26.854287 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.854255 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body=
Apr 24 16:42:26.854415 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.854307 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused"
Apr 24 16:42:26.865423 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.865395 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"
Apr 24 16:42:26.999836 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:26.999802 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf"]
Apr 24 16:42:27.003091 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:27.003066 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1df51550_2fa9_479c_9498_a30481fa319b.slice/crio-1ae3b5ea9d4db725c4262d786dc33bd3843873bd02042fb035fda0759921b669 WatchSource:0}: Error finding container 1ae3b5ea9d4db725c4262d786dc33bd3843873bd02042fb035fda0759921b669: Status 404 returned error can't find the container with id 1ae3b5ea9d4db725c4262d786dc33bd3843873bd02042fb035fda0759921b669
Apr 24 16:42:27.115532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.115411 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:42:27.115532 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.115469 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:42:27.197779 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.197743 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" event={"ID":"1df51550-2fa9-479c-9498-a30481fa319b","Type":"ContainerStarted","Data":"38fd7e307512d865e3cfa691ed666366f74c1dc04b3b50a66b86c9b612abc31f"}
Apr 24 16:42:27.197871 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.197786 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" event={"ID":"1df51550-2fa9-479c-9498-a30481fa319b","Type":"ContainerStarted","Data":"1ae3b5ea9d4db725c4262d786dc33bd3843873bd02042fb035fda0759921b669"}
Apr 24 16:42:27.202866 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.202844 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs"
Apr 24 16:42:27.205402 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.205379 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/c769eab2-225b-4974-b9d8-355c809d940c-node-exporter-tls\") pod \"node-exporter-5gkcs\" (UID: \"c769eab2-225b-4974-b9d8-355c809d940c\") " pod="openshift-monitoring/node-exporter-5gkcs"
Apr 24 16:42:27.494209 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:27.494172 2575 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/node-exporter-5gkcs" Apr 24 16:42:27.503460 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:27.503429 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc769eab2_225b_4974_b9d8_355c809d940c.slice/crio-6f659aefd7ea44dd67f1b93117f426ef6353e053a9f020526bf04d580ba04441 WatchSource:0}: Error finding container 6f659aefd7ea44dd67f1b93117f426ef6353e053a9f020526bf04d580ba04441: Status 404 returned error can't find the container with id 6f659aefd7ea44dd67f1b93117f426ef6353e053a9f020526bf04d580ba04441 Apr 24 16:42:28.124428 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.124092 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" podUID="aaba430d-6bd1-4bba-b94f-f0ca122b7f17" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 16:42:28.202519 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.202441 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5gkcs" event={"ID":"c769eab2-225b-4974-b9d8-355c809d940c","Type":"ContainerStarted","Data":"6f659aefd7ea44dd67f1b93117f426ef6353e053a9f020526bf04d580ba04441"} Apr 24 16:42:28.206075 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.206039 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" event={"ID":"1df51550-2fa9-479c-9498-a30481fa319b","Type":"ContainerStarted","Data":"970210cff3e589405f7c741955a26818017273b645f39775e0d132a062def064"} Apr 24 16:42:28.611621 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.611586 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk"] Apr 24 16:42:28.633025 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.632996 2575 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk"] Apr 24 16:42:28.633181 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.633157 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.635883 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.635855 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-85hu3uko6jb5o\"" Apr 24 16:42:28.636006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.635887 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 24 16:42:28.636006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.635971 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 24 16:42:28.636006 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.635988 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-7s8pb\"" Apr 24 16:42:28.636378 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.636362 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 24 16:42:28.636591 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.636573 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 24 16:42:28.636776 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.636761 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 24 16:42:28.716895 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.716861 2575 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-tls\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.716977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717016 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717061 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717120 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/592c17e3-bd85-4c68-9861-be16b33fb3ce-kube-api-access-mtk5d\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717158 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-grpc-tls\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717185 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/592c17e3-bd85-4c68-9861-be16b33fb3ce-metrics-client-ca\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.717267 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.717206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.817847 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818023 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818023 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817902 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818023 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/592c17e3-bd85-4c68-9861-be16b33fb3ce-kube-api-access-mtk5d\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818023 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-grpc-tls\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: 
\"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818023 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.817990 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/592c17e3-bd85-4c68-9861-be16b33fb3ce-metrics-client-ca\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818313 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.818269 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818393 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.818376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-tls\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.818726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.818703 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/592c17e3-bd85-4c68-9861-be16b33fb3ce-metrics-client-ca\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.820898 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.820851 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.821429 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.821404 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.821880 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.821852 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.821976 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.821890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-grpc-tls\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.822219 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.822196 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-tls\") pod 
\"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.822307 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.822271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/592c17e3-bd85-4c68-9861-be16b33fb3ce-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.833439 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.833417 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtk5d\" (UniqueName: \"kubernetes.io/projected/592c17e3-bd85-4c68-9861-be16b33fb3ce-kube-api-access-mtk5d\") pod \"thanos-querier-5d9d75b6dd-5nqtk\" (UID: \"592c17e3-bd85-4c68-9861-be16b33fb3ce\") " pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:28.944868 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:28.944781 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" Apr 24 16:42:29.220535 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:29.220442 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk"] Apr 24 16:42:29.223970 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:29.223928 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod592c17e3_bd85_4c68_9861_be16b33fb3ce.slice/crio-303b538ec2e1e6ff503505213275a02bf0bc7fe4250dc5c3b1669c3245b6ccf9 WatchSource:0}: Error finding container 303b538ec2e1e6ff503505213275a02bf0bc7fe4250dc5c3b1669c3245b6ccf9: Status 404 returned error can't find the container with id 303b538ec2e1e6ff503505213275a02bf0bc7fe4250dc5c3b1669c3245b6ccf9 Apr 24 16:42:30.216457 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.216411 2575 generic.go:358] "Generic (PLEG): container finished" podID="c769eab2-225b-4974-b9d8-355c809d940c" containerID="1935743c8aebab26c5f58946edaf89b23dfd29c1d43e80acf8da42ef3ac6e014" exitCode=0 Apr 24 16:42:30.216710 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.216515 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5gkcs" event={"ID":"c769eab2-225b-4974-b9d8-355c809d940c","Type":"ContainerDied","Data":"1935743c8aebab26c5f58946edaf89b23dfd29c1d43e80acf8da42ef3ac6e014"} Apr 24 16:42:30.220454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.220420 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" event={"ID":"1df51550-2fa9-479c-9498-a30481fa319b","Type":"ContainerStarted","Data":"fd9ca931c9ff378faa34a2cfb45e6a08b896d280c486df781c778b2c86bd59fe"} Apr 24 16:42:30.222287 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.222255 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" 
event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"303b538ec2e1e6ff503505213275a02bf0bc7fe4250dc5c3b1669c3245b6ccf9"} Apr 24 16:42:30.252639 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.252577 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-zswgf" podStartSLOduration=2.421911156 podStartE2EDuration="4.252560396s" podCreationTimestamp="2026-04-24 16:42:26 +0000 UTC" firstStartedPulling="2026-04-24 16:42:27.237613125 +0000 UTC m=+200.228301921" lastFinishedPulling="2026-04-24 16:42:29.068262351 +0000 UTC m=+202.058951161" observedRunningTime="2026-04-24 16:42:30.251836416 +0000 UTC m=+203.242525234" watchObservedRunningTime="2026-04-24 16:42:30.252560396 +0000 UTC m=+203.243249214" Apr 24 16:42:30.351053 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:30.350976 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry" containerID="cri-o://fbda478aa3030e4a256723e7ab80f5a6d4bcf50e41c07c673d36d0d731145d4c" gracePeriod=30 Apr 24 16:42:31.226935 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.226894 2575 generic.go:358] "Generic (PLEG): container finished" podID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerID="fbda478aa3030e4a256723e7ab80f5a6d4bcf50e41c07c673d36d0d731145d4c" exitCode=0 Apr 24 16:42:31.227488 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.226980 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" event={"ID":"a700cfda-aa55-4939-9d44-8aabc257f1bb","Type":"ContainerDied","Data":"fbda478aa3030e4a256723e7ab80f5a6d4bcf50e41c07c673d36d0d731145d4c"} Apr 24 16:42:31.229139 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.229089 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-5gkcs" event={"ID":"c769eab2-225b-4974-b9d8-355c809d940c","Type":"ContainerStarted","Data":"2efceb1ecde50fcdda392f0cf6169abc54cfa6943ba15fbd3e74cb7a7cf2649d"}
Apr 24 16:42:31.229241 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.229141 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5gkcs" event={"ID":"c769eab2-225b-4974-b9d8-355c809d940c","Type":"ContainerStarted","Data":"49b2083f8ea72c58f99fb6d32fee3589e5a78d56e9d9d5eae5e6b8a8ca33f3ec"}
Apr 24 16:42:31.256318 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.256266 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5gkcs" podStartSLOduration=3.691827017 podStartE2EDuration="5.25625393s" podCreationTimestamp="2026-04-24 16:42:26 +0000 UTC" firstStartedPulling="2026-04-24 16:42:27.505270385 +0000 UTC m=+200.495959180" lastFinishedPulling="2026-04-24 16:42:29.069697297 +0000 UTC m=+202.060386093" observedRunningTime="2026-04-24 16:42:31.254802033 +0000 UTC m=+204.245490850" watchObservedRunningTime="2026-04-24 16:42:31.25625393 +0000 UTC m=+204.246942789"
Apr 24 16:42:31.277818 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.277794 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5"
Apr 24 16:42:31.343556 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343526 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrc9z\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343717 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343585 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343717 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343612 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343829 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343720 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343829 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343803 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343933 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343850 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.343933 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343880 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.344027 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.343956 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca\") pod \"a700cfda-aa55-4939-9d44-8aabc257f1bb\" (UID: \"a700cfda-aa55-4939-9d44-8aabc257f1bb\") "
Apr 24 16:42:31.344146 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.344087 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:31.344366 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.344268 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-certificates\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.344594 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.344569 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 16:42:31.345946 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.345900 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:31.346333 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.346312 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z" (OuterVolumeSpecName: "kube-api-access-nrc9z") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "kube-api-access-nrc9z". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:31.346710 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.346667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:31.346816 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.346735 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 16:42:31.348073 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.348052 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 16:42:31.354343 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.354316 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a700cfda-aa55-4939-9d44-8aabc257f1bb" (UID: "a700cfda-aa55-4939-9d44-8aabc257f1bb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 16:42:31.400764 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.400725 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"]
Apr 24 16:42:31.401176 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.401155 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry"
Apr 24 16:42:31.401176 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.401174 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry"
Apr 24 16:42:31.401318 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.401228 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" containerName="registry"
Apr 24 16:42:31.423239 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.423216 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"]
Apr 24 16:42:31.423369 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.423353 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445182 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445152 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445290 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445200 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445290 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445290 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445264 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445431 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445371 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445431 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445396 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cstxj\" (UniqueName: \"kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445513 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445429 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.445513 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445495 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a700cfda-aa55-4939-9d44-8aabc257f1bb-trusted-ca\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445513 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445506 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nrc9z\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-kube-api-access-nrc9z\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445645 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445515 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-image-registry-private-configuration\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445645 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445524 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-registry-tls\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445645 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445533 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a700cfda-aa55-4939-9d44-8aabc257f1bb-bound-sa-token\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445645 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445545 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a700cfda-aa55-4939-9d44-8aabc257f1bb-ca-trust-extracted\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.445645 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.445557 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a700cfda-aa55-4939-9d44-8aabc257f1bb-installation-pull-secrets\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\""
Apr 24 16:42:31.545952 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.545922 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.545956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cstxj\" (UniqueName: \"kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546004 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546072 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546065 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546247 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546090 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546247 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546142 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546247 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546174 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546602 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546571 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546678 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546631 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.546990 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.546966 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.547161 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.547135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.548841 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.548821 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.549127 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.549086 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.554834 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.554807 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cstxj\" (UniqueName: \"kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj\") pod \"console-7d4b845d79-db8xv\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.734292 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.734253 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:31.912612 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:31.911977 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"]
Apr 24 16:42:31.916963 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:42:31.916930 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91eefffc_7351_4ac1_802a_4880b5632d16.slice/crio-cd48fc920711ec79e83a146abdbd2e692db48a9947eb8312f683e38338475b5c WatchSource:0}: Error finding container cd48fc920711ec79e83a146abdbd2e692db48a9947eb8312f683e38338475b5c: Status 404 returned error can't find the container with id cd48fc920711ec79e83a146abdbd2e692db48a9947eb8312f683e38338475b5c
Apr 24 16:42:32.233315 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.233284 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"9d7fdf8d8786f6c785baa8db0ce2775b327298f0d5f52bfccbf744045ab7e24f"}
Apr 24 16:42:32.233679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.233324 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"d77f6b5230b8c09e581839619d94e56aa0d73c88cf8c01c28531a4d5f5c1ba1d"}
Apr 24 16:42:32.233679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.233337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"0a994243c8011f8f1753652cdd50930dc66d68fb75d17f5306f44c065f75fc25"}
Apr 24 16:42:32.238981 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.238952 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4b845d79-db8xv" event={"ID":"91eefffc-7351-4ac1-802a-4880b5632d16","Type":"ContainerStarted","Data":"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79"}
Apr 24 16:42:32.239091 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.238988 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4b845d79-db8xv" event={"ID":"91eefffc-7351-4ac1-802a-4880b5632d16","Type":"ContainerStarted","Data":"cd48fc920711ec79e83a146abdbd2e692db48a9947eb8312f683e38338475b5c"}
Apr 24 16:42:32.240230 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.240204 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5" event={"ID":"a700cfda-aa55-4939-9d44-8aabc257f1bb","Type":"ContainerDied","Data":"c20288027e5b35a403c7fff89f48e2ee085a2ad13a2fe7d79e2695a1722c1697"}
Apr 24 16:42:32.240327 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.240206 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-848c5fcd6b-bpxr5"
Apr 24 16:42:32.240327 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.240241 2575 scope.go:117] "RemoveContainer" containerID="fbda478aa3030e4a256723e7ab80f5a6d4bcf50e41c07c673d36d0d731145d4c"
Apr 24 16:42:32.257366 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.257323 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7d4b845d79-db8xv" podStartSLOduration=1.257307175 podStartE2EDuration="1.257307175s" podCreationTimestamp="2026-04-24 16:42:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:42:32.257259606 +0000 UTC m=+205.247948424" watchObservedRunningTime="2026-04-24 16:42:32.257307175 +0000 UTC m=+205.247995993"
Apr 24 16:42:32.273989 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.273968 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"]
Apr 24 16:42:32.278991 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:32.278971 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-848c5fcd6b-bpxr5"]
Apr 24 16:42:33.509902 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:33.509873 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a700cfda-aa55-4939-9d44-8aabc257f1bb" path="/var/lib/kubelet/pods/a700cfda-aa55-4939-9d44-8aabc257f1bb/volumes"
Apr 24 16:42:34.248768 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:34.248731 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"bde1788e4e18ad25a203496b98a12762e5ac61a1442b5630614ff9bd3354137d"}
Apr 24 16:42:34.248768 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:34.248769 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"e897936598d8fe807eef89579ebe508ceaf5acc8a12b7530105872dc45f614d0"}
Apr 24 16:42:34.248967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:34.248778 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" event={"ID":"592c17e3-bd85-4c68-9861-be16b33fb3ce","Type":"ContainerStarted","Data":"bc3b3a50ce5c0c7be9d4e4f31cab93bb7d2f9a542bfe08cc00e652245b375939"}
Apr 24 16:42:34.248967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:34.248888 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk"
Apr 24 16:42:34.275967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:34.275917 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk" podStartSLOduration=2.054836088 podStartE2EDuration="6.275901696s" podCreationTimestamp="2026-04-24 16:42:28 +0000 UTC" firstStartedPulling="2026-04-24 16:42:29.226534149 +0000 UTC m=+202.217222944" lastFinishedPulling="2026-04-24 16:42:33.447599754 +0000 UTC m=+206.438288552" observedRunningTime="2026-04-24 16:42:34.274645589 +0000 UTC m=+207.265334404" watchObservedRunningTime="2026-04-24 16:42:34.275901696 +0000 UTC m=+207.266590515"
Apr 24 16:42:35.681211 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:35.681173 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:42:35.681557 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:35.681238 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:42:36.853485 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:36.853450 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body=
Apr 24 16:42:36.853910 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:36.853516 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused"
Apr 24 16:42:37.114755 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:37.114667 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:42:37.114755 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:37.114717 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:42:38.123448 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:38.123412 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" podUID="aaba430d-6bd1-4bba-b94f-f0ca122b7f17" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:42:40.259737 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:40.259711 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-5d9d75b6dd-5nqtk"
Apr 24 16:42:41.734918 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:41.734881 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:41.734918 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:41.734927 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7d4b845d79-db8xv"
Apr 24 16:42:41.735866 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:41.735845 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body=
Apr 24 16:42:41.735924 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:41.735904 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused"
Apr 24 16:42:45.680726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.680686 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:42:45.681095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.680736 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:42:45.681095 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.680774 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw"
Apr 24 16:42:45.681346 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.681303 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83"} pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" containerMessage="Container registry failed liveness probe, will be restarted"
Apr 24 16:42:45.684752 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.684728 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 24 16:42:45.684869 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:45.684764 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 16:42:46.853020 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:46.852982 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body=
Apr 24 16:42:46.853415 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:46.853041 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused"
Apr 24 16:42:48.123894 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.123852 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" podUID="aaba430d-6bd1-4bba-b94f-f0ca122b7f17" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 24 16:42:48.124273 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.123921 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9"
Apr 24 16:42:48.124405 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.124388 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"f4611f05031a9d95149ddfc1188356a2dca2581604ef0591bd7c1bfc64eadfa7"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" containerMessage="Container service-proxy failed liveness probe, will be restarted"
Apr 24 16:42:48.124447 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.124427 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" podUID="aaba430d-6bd1-4bba-b94f-f0ca122b7f17" containerName="service-proxy" containerID="cri-o://f4611f05031a9d95149ddfc1188356a2dca2581604ef0591bd7c1bfc64eadfa7" gracePeriod=30
Apr 24 16:42:48.291317 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.291289 2575 generic.go:358] "Generic (PLEG): container finished" podID="aaba430d-6bd1-4bba-b94f-f0ca122b7f17" containerID="f4611f05031a9d95149ddfc1188356a2dca2581604ef0591bd7c1bfc64eadfa7" exitCode=2
Apr 24 16:42:48.291417 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:48.291353 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerDied","Data":"f4611f05031a9d95149ddfc1188356a2dca2581604ef0591bd7c1bfc64eadfa7"}
Apr 24 16:42:49.296302 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:49.296264 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-6f7f895d88-lbst9" event={"ID":"aaba430d-6bd1-4bba-b94f-f0ca122b7f17","Type":"ContainerStarted","Data":"13ae0342bd326a40fb5dabc999a3133597f96aec7cb42c19bfd1e3af4f0d7fdb"}
Apr 24 16:42:51.735047 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:51.735011 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body=
Apr 24 16:42:51.735469 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:51.735065 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused"
Apr 24 16:42:55.685235 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:55.685190 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: 
please see /debug/health"}]} Apr 24 16:42:55.685598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:55.685243 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:42:56.853234 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:56.853191 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:42:56.853753 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:42:56.853247 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:01.735633 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:01.735596 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:01.736009 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:01.735665 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:05.684627 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:05.684600 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry 
namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:05.684978 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:05.684647 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:06.853143 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:06.853081 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:43:06.853520 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:06.853152 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:07.389215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:07.389163 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/init-textfile/0.log" Apr 24 16:43:07.590341 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:07.590313 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/node-exporter/0.log" Apr 24 16:43:07.795898 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:07.795858 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/kube-rbac-proxy/0.log" Apr 24 16:43:08.590855 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:08.590824 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/kube-rbac-proxy-main/0.log" Apr 24 16:43:08.789982 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:08.789950 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/kube-rbac-proxy-self/0.log" Apr 24 16:43:08.989598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:08.989569 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/openshift-state-metrics/0.log" Apr 24 16:43:10.591909 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:10.591879 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jbmtl_d37af086-7938-46de-bda3-6b2a30be7321/prometheus-operator/0.log" Apr 24 16:43:10.699691 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:10.699657 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" containerID="cri-o://b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83" gracePeriod=30 Apr 24 16:43:10.790404 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:10.790375 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jbmtl_d37af086-7938-46de-bda3-6b2a30be7321/kube-rbac-proxy/0.log" Apr 24 16:43:11.189347 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.189318 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/thanos-query/0.log" Apr 24 16:43:11.389227 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.389199 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-web/0.log" Apr 24 16:43:11.589402 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.589369 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy/0.log" Apr 24 16:43:11.735117 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.735073 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:11.735454 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.735136 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:11.788953 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.788930 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/prom-label-proxy/0.log" Apr 24 16:43:11.989179 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:11.989144 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-rules/0.log" Apr 24 16:43:12.189417 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.189341 2575 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-metrics/0.log" Apr 24 16:43:12.361353 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.361322 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerID="b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83" exitCode=0 Apr 24 16:43:12.361512 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.361395 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerDied","Data":"b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83"} Apr 24 16:43:12.361512 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.361434 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerStarted","Data":"89aa2c7fd8b9af6a9cb31c2ac972a0d2b0d41cda3a07335e3484f7d5cf09a433"} Apr 24 16:43:12.361762 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.361737 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:43:12.388553 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.388527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qsbsc_796ab223-e545-471e-9868-71174bdad1bf/networking-console-plugin/0.log" Apr 24 16:43:12.990247 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:12.990215 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d7fbd94b-xn8fc_0bd66365-6a23-4c9b-a037-0d6e249a056e/console/0.log" Apr 24 16:43:13.193460 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:13.193437 2575 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-console_console-7d4b845d79-db8xv_91eefffc-7351-4ac1-802a-4880b5632d16/console/0.log" Apr 24 16:43:13.391722 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:13.391648 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zw6kr_e75cf70a-707a-43de-bb15-bdf89460c4ca/download-server/0.log" Apr 24 16:43:14.389205 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:14.389173 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-49ml9_7f2523fe-21a3-46f7-a03b-88e7ae991338/dns-node-resolver/0.log" Apr 24 16:43:14.992860 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:14.992832 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b969cbff9-f4cgw_8e30bd49-e02e-4cb9-9908-d3dd4ede132a/registry/1.log" Apr 24 16:43:15.189291 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:15.189267 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vvwg9_f09d386a-3466-46d1-a1d1-efb87cc77eba/node-ca/0.log" Apr 24 16:43:16.853236 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:16.853201 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:43:16.853667 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:16.853294 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:19.351228 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:19.351171 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:43:19.353518 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:19.353496 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f8b7b6cb-c76c-42e3-9193-9423bbd58047-metrics-certs\") pod \"network-metrics-daemon-tgkjm\" (UID: \"f8b7b6cb-c76c-42e3-9193-9423bbd58047\") " pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:43:19.508747 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:19.508711 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-xv2x8\"" Apr 24 16:43:19.516226 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:19.516207 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-tgkjm" Apr 24 16:43:19.652907 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:19.652830 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-tgkjm"] Apr 24 16:43:19.656476 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:43:19.656451 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b7b6cb_c76c_42e3_9193_9423bbd58047.slice/crio-fe875a6e0891233671daf453b23b340dff3f81ac82dd4302df8f09b102e1f880 WatchSource:0}: Error finding container fe875a6e0891233671daf453b23b340dff3f81ac82dd4302df8f09b102e1f880: Status 404 returned error can't find the container with id fe875a6e0891233671daf453b23b340dff3f81ac82dd4302df8f09b102e1f880 Apr 24 16:43:20.388460 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:20.388422 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-tgkjm" event={"ID":"f8b7b6cb-c76c-42e3-9193-9423bbd58047","Type":"ContainerStarted","Data":"fe875a6e0891233671daf453b23b340dff3f81ac82dd4302df8f09b102e1f880"} Apr 24 16:43:21.392598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:21.392564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tgkjm" event={"ID":"f8b7b6cb-c76c-42e3-9193-9423bbd58047","Type":"ContainerStarted","Data":"f2ab530ad3a30165e9c1ff23b11e32b07070cfcf0330d2059468c40f5eb73244"} Apr 24 16:43:21.392598 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:21.392599 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-tgkjm" event={"ID":"f8b7b6cb-c76c-42e3-9193-9423bbd58047","Type":"ContainerStarted","Data":"986f6d4f12ced4a3f3c4f6da5a2898d0697d33cec864c9756e08c70a0a3e8be7"} Apr 24 16:43:21.408408 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:21.408355 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-tgkjm" podStartSLOduration=253.534151726 podStartE2EDuration="4m14.408340972s" podCreationTimestamp="2026-04-24 16:39:07 +0000 UTC" firstStartedPulling="2026-04-24 16:43:19.658683007 +0000 UTC m=+252.649371803" lastFinishedPulling="2026-04-24 16:43:20.532872251 +0000 UTC m=+253.523561049" observedRunningTime="2026-04-24 16:43:21.406984343 +0000 UTC m=+254.397673181" watchObservedRunningTime="2026-04-24 16:43:21.408340972 +0000 UTC m=+254.399029789" Apr 24 16:43:21.735743 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:21.735707 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:21.735922 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:21.735759 2575 prober.go:120] "Probe failed" 
probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:25.682122 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:25.682034 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:25.682122 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:25.682092 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:26.852449 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:26.852414 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:43:26.852809 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:26.852465 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:31.735000 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:31.734964 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:31.735420 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:31.735019 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:33.369697 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:33.369668 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:33.370172 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:33.369723 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:35.681005 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:35.680974 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:35.681374 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:35.681029 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed 
with statuscode: 503" Apr 24 16:43:36.852727 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:36.852687 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:43:36.853204 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:36.852740 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:41.735194 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:41.735157 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:41.735571 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:41.735214 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:43.369070 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:43.369037 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:43.369602 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:43.369085 2575 prober.go:120] 
"Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:45.680430 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.680401 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:45.680843 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.680454 2575 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:45.680843 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.680494 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:43:45.680967 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.680936 2575 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"89aa2c7fd8b9af6a9cb31c2ac972a0d2b0d41cda3a07335e3484f7d5cf09a433"} pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" containerMessage="Container registry failed liveness probe, will be restarted" Apr 24 16:43:45.684312 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.684281 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service 
unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:45.684416 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:45.684319 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:43:46.852958 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:46.852920 2575 patch_prober.go:28] interesting pod/console-6d7fbd94b-xn8fc container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" start-of-body= Apr 24 16:43:46.853370 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:46.852979 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" probeResult="failure" output="Get \"https://10.132.0.16:8443/health\": dial tcp 10.132.0.16:8443: connect: connection refused" Apr 24 16:43:48.038759 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:43:48.038721 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-b68bd" podUID="25e3ae92-c693-4c50-b5ce-0ed6ad115edd" Apr 24 16:43:48.468765 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:48.468737 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-b68bd" Apr 24 16:43:51.000726 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.000693 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:43:51.001093 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.000745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd" Apr 24 16:43:51.003650 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.003626 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b0e3d259-e5e4-4160-8258-8d97913d476a-cert\") pod \"ingress-canary-5nfqq\" (UID: \"b0e3d259-e5e4-4160-8258-8d97913d476a\") " pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:43:51.004312 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.004289 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/25e3ae92-c693-4c50-b5ce-0ed6ad115edd-metrics-tls\") pod \"dns-default-b68bd\" (UID: \"25e3ae92-c693-4c50-b5ce-0ed6ad115edd\") " pod="openshift-dns/dns-default-b68bd" Apr 24 16:43:51.172395 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.172367 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-6n2jd\"" Apr 24 16:43:51.179837 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.179818 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-b68bd" Apr 24 16:43:51.209059 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.209027 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-szcnf\"" Apr 24 16:43:51.215667 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.215627 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5nfqq" Apr 24 16:43:51.311737 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.311711 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-b68bd"] Apr 24 16:43:51.312611 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:43:51.312583 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25e3ae92_c693_4c50_b5ce_0ed6ad115edd.slice/crio-c6e49f3561ce7927d0cdfef61de156b193b082d8071318cacd548f11e7b0b4b0 WatchSource:0}: Error finding container c6e49f3561ce7927d0cdfef61de156b193b082d8071318cacd548f11e7b0b4b0: Status 404 returned error can't find the container with id c6e49f3561ce7927d0cdfef61de156b193b082d8071318cacd548f11e7b0b4b0 Apr 24 16:43:51.344708 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.344688 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5nfqq"] Apr 24 16:43:51.346942 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:43:51.346917 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0e3d259_e5e4_4160_8258_8d97913d476a.slice/crio-955f29a4fdbb66ae0530311243996bfcc32b47ced13fb6a6b4bf9b6efb90cd8d WatchSource:0}: Error finding container 955f29a4fdbb66ae0530311243996bfcc32b47ced13fb6a6b4bf9b6efb90cd8d: Status 404 returned error can't find the container with id 955f29a4fdbb66ae0530311243996bfcc32b47ced13fb6a6b4bf9b6efb90cd8d Apr 24 16:43:51.477962 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.477927 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b68bd" event={"ID":"25e3ae92-c693-4c50-b5ce-0ed6ad115edd","Type":"ContainerStarted","Data":"c6e49f3561ce7927d0cdfef61de156b193b082d8071318cacd548f11e7b0b4b0"} Apr 24 16:43:51.478858 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.478839 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nfqq" event={"ID":"b0e3d259-e5e4-4160-8258-8d97913d476a","Type":"ContainerStarted","Data":"955f29a4fdbb66ae0530311243996bfcc32b47ced13fb6a6b4bf9b6efb90cd8d"} Apr 24 16:43:51.735756 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.735705 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:43:51.735976 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:51.735781 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:43:53.487252 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.487219 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b68bd" event={"ID":"25e3ae92-c693-4c50-b5ce-0ed6ad115edd","Type":"ContainerStarted","Data":"0e060385f54aade4ca219801b365e52723efc350270e8dfcab6e7a60eb59b6f2"} Apr 24 16:43:53.487252 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.487256 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-b68bd" 
event={"ID":"25e3ae92-c693-4c50-b5ce-0ed6ad115edd","Type":"ContainerStarted","Data":"ee3a09167c8131ccf8914317fb92595d74750c3dbd3e2e52d97f2766b3a1063c"} Apr 24 16:43:53.487747 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.487308 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-b68bd" Apr 24 16:43:53.488627 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.488605 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5nfqq" event={"ID":"b0e3d259-e5e4-4160-8258-8d97913d476a","Type":"ContainerStarted","Data":"adb5b189db32226668330b71435bacb91283d80fc0195eb2d062bb7a1a1c8d17"} Apr 24 16:43:53.506691 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.506653 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-b68bd" podStartSLOduration=251.65375947 podStartE2EDuration="4m13.506641961s" podCreationTimestamp="2026-04-24 16:39:40 +0000 UTC" firstStartedPulling="2026-04-24 16:43:51.315062208 +0000 UTC m=+284.305751004" lastFinishedPulling="2026-04-24 16:43:53.167944699 +0000 UTC m=+286.158633495" observedRunningTime="2026-04-24 16:43:53.504765819 +0000 UTC m=+286.495454635" watchObservedRunningTime="2026-04-24 16:43:53.506641961 +0000 UTC m=+286.497330819" Apr 24 16:43:53.520253 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:53.520210 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5nfqq" podStartSLOduration=251.697754134 podStartE2EDuration="4m13.520197902s" podCreationTimestamp="2026-04-24 16:39:40 +0000 UTC" firstStartedPulling="2026-04-24 16:43:51.34884765 +0000 UTC m=+284.339536445" lastFinishedPulling="2026-04-24 16:43:53.171291416 +0000 UTC m=+286.161980213" observedRunningTime="2026-04-24 16:43:53.519356146 +0000 UTC m=+286.510044963" watchObservedRunningTime="2026-04-24 16:43:53.520197902 +0000 UTC m=+286.510886719" Apr 24 16:43:54.249377 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.249345 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:43:54.291962 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.291934 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:43:54.295096 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.295081 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.306195 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.306175 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:43:54.431781 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431751 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.431781 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.431934 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " 
pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.431934 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431850 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.431934 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.431934 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431930 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmmv\" (UniqueName: \"kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.432055 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.431978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533030 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.532956 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533030 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.532995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533030 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533012 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533030 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533030 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533046 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533157 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533586 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533323 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmmv\" (UniqueName: \"kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533810 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533789 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.533949 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.533927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.534080 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.534061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.534147 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.534081 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.535729 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.535705 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.535820 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.535732 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.542078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.542058 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmmv\" (UniqueName: \"kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv\") pod \"console-b7c79cc4f-xzdfr\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.603945 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.603919 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:43:54.725719 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:54.725664 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:43:54.728189 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:43:54.728160 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74947aed_ab5c_468a_8f5d_e8d2affda983.slice/crio-b89a8653b078ec8de9fcaea6fca1f39f88724de9066eff38cfa2e0a33b5e3f02 WatchSource:0}: Error finding container b89a8653b078ec8de9fcaea6fca1f39f88724de9066eff38cfa2e0a33b5e3f02: Status 404 returned error can't find the container with id b89a8653b078ec8de9fcaea6fca1f39f88724de9066eff38cfa2e0a33b5e3f02 Apr 24 16:43:55.495320 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:55.495271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7c79cc4f-xzdfr" event={"ID":"74947aed-ab5c-468a-8f5d-e8d2affda983","Type":"ContainerStarted","Data":"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab"} Apr 24 16:43:55.495320 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:55.495320 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7c79cc4f-xzdfr" event={"ID":"74947aed-ab5c-468a-8f5d-e8d2affda983","Type":"ContainerStarted","Data":"b89a8653b078ec8de9fcaea6fca1f39f88724de9066eff38cfa2e0a33b5e3f02"} Apr 24 16:43:55.512654 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:55.512609 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b7c79cc4f-xzdfr" podStartSLOduration=1.512596244 podStartE2EDuration="1.512596244s" podCreationTimestamp="2026-04-24 16:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:43:55.510648374 +0000 UTC m=+288.501337190" 
watchObservedRunningTime="2026-04-24 16:43:55.512596244 +0000 UTC m=+288.503285062" Apr 24 16:43:55.684624 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:55.684596 2575 patch_prober.go:28] interesting pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:43:55.685026 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:43:55.684644 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:44:01.735501 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:01.735464 2575 patch_prober.go:28] interesting pod/console-7d4b845d79-db8xv container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" start-of-body= Apr 24 16:44:01.735873 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:01.735517 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" probeResult="failure" output="Get \"https://10.132.0.19:8443/health\": dial tcp 10.132.0.19:8443: connect: connection refused" Apr 24 16:44:03.493975 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:03.493941 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-b68bd" Apr 24 16:44:04.523605 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.523528 2575 generic.go:358] "Generic (PLEG): container finished" podID="d0351ca2-fa39-42bc-b374-b53d08923ed5" 
containerID="f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2" exitCode=1 Apr 24 16:44:04.523974 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.523601 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerDied","Data":"f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2"} Apr 24 16:44:04.523974 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.523642 2575 scope.go:117] "RemoveContainer" containerID="66770dff0d93297d66535e9672376cb03e107912a76671bfb85b8242488fedd1" Apr 24 16:44:04.523974 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.523958 2575 scope.go:117] "RemoveContainer" containerID="f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2" Apr 24 16:44:04.524206 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:04.524188 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-68f9c6d688-t56dc_open-cluster-management-agent-addon(d0351ca2-fa39-42bc-b374-b53d08923ed5)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" podUID="d0351ca2-fa39-42bc-b374-b53d08923ed5" Apr 24 16:44:04.604985 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.604958 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:04.605085 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.605000 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:04.606475 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.606448 2575 patch_prober.go:28] interesting pod/console-b7c79cc4f-xzdfr container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" start-of-body= Apr 24 16:44:04.606563 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:04.606487 2575 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-b7c79cc4f-xzdfr" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerName="console" probeResult="failure" output="Get \"https://10.132.0.20:8443/health\": dial tcp 10.132.0.20:8443: connect: connection refused" Apr 24 16:44:05.002090 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.002058 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"] Apr 24 16:44:05.027881 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.027853 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:44:05.031205 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.031189 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.039863 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.039842 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:44:05.095491 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.095467 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:44:05.118865 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118845 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.118925 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.118925 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118894 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.118925 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118911 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2p2hm\" (UniqueName: \"kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.119015 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118934 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.119015 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118972 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.119015 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.118992 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219548 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219520 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219668 ip-10-0-137-83 
kubenswrapper[2575]: I0424 16:44:05.219552 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219668 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219668 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219588 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2p2hm\" (UniqueName: \"kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219668 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219605 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219668 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219647 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " 
pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.219927 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.219678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.220379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.220271 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.220379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.220428 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.220618 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.220475 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.220695 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.220668 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle\") pod \"console-546fd77464-7fnjz\" (UID: 
\"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.222279 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.222256 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.222345 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.222259 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.227908 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.227885 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p2hm\" (UniqueName: \"kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm\") pod \"console-546fd77464-7fnjz\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.340903 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.340833 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:05.482554 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.482528 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:44:05.485094 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:44:05.485066 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f48b270_f607_407c_af85_145d9d54d482.slice/crio-1c02b7ab434988bb3230d3f2da5aaf4dcacd82f4bbfe9d244e0db07d08d6782b WatchSource:0}: Error finding container 1c02b7ab434988bb3230d3f2da5aaf4dcacd82f4bbfe9d244e0db07d08d6782b: Status 404 returned error can't find the container with id 1c02b7ab434988bb3230d3f2da5aaf4dcacd82f4bbfe9d244e0db07d08d6782b Apr 24 16:44:05.527332 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.527310 2575 scope.go:117] "RemoveContainer" containerID="f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2" Apr 24 16:44:05.527670 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:05.527551 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-68f9c6d688-t56dc_open-cluster-management-agent-addon(d0351ca2-fa39-42bc-b374-b53d08923ed5)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" podUID="d0351ca2-fa39-42bc-b374-b53d08923ed5" Apr 24 16:44:05.528354 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.528328 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546fd77464-7fnjz" event={"ID":"0f48b270-f607-407c-af85-145d9d54d482","Type":"ContainerStarted","Data":"1c02b7ab434988bb3230d3f2da5aaf4dcacd82f4bbfe9d244e0db07d08d6782b"} Apr 24 16:44:05.684231 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.684160 2575 patch_prober.go:28] interesting 
pod/image-registry-b969cbff9-f4cgw container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 16:44:05.684231 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:05.684207 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 16:44:06.532612 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:06.532578 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546fd77464-7fnjz" event={"ID":"0f48b270-f607-407c-af85-145d9d54d482","Type":"ContainerStarted","Data":"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd"} Apr 24 16:44:06.555485 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:06.555442 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-546fd77464-7fnjz" podStartSLOduration=1.555429535 podStartE2EDuration="1.555429535s" podCreationTimestamp="2026-04-24 16:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:44:06.554692579 +0000 UTC m=+299.545381398" watchObservedRunningTime="2026-04-24 16:44:06.555429535 +0000 UTC m=+299.546118352" Apr 24 16:44:07.404600 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:07.404527 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:44:07.405531 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:07.405481 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:44:07.411603 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:07.411583 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 16:44:08.094833 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:08.094799 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:44:08.097164 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:08.095221 2575 scope.go:117] "RemoveContainer" containerID="f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2" Apr 24 16:44:08.097164 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:08.095411 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"acm-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=acm-agent pod=klusterlet-addon-workmgr-68f9c6d688-t56dc_open-cluster-management-agent-addon(d0351ca2-fa39-42bc-b374-b53d08923ed5)\"" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" podUID="d0351ca2-fa39-42bc-b374-b53d08923ed5" Apr 24 16:44:10.699334 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:10.699293 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" podUID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" containerName="registry" containerID="cri-o://89aa2c7fd8b9af6a9cb31c2ac972a0d2b0d41cda3a07335e3484f7d5cf09a433" gracePeriod=30 Apr 24 16:44:11.812745 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:11.812726 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:44:12.553431 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:12.553391 2575 generic.go:358] "Generic (PLEG): container finished" podID="8e30bd49-e02e-4cb9-9908-d3dd4ede132a" 
containerID="89aa2c7fd8b9af6a9cb31c2ac972a0d2b0d41cda3a07335e3484f7d5cf09a433" exitCode=0 Apr 24 16:44:12.553610 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:12.553463 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerDied","Data":"89aa2c7fd8b9af6a9cb31c2ac972a0d2b0d41cda3a07335e3484f7d5cf09a433"} Apr 24 16:44:12.553610 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:12.553497 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" event={"ID":"8e30bd49-e02e-4cb9-9908-d3dd4ede132a","Type":"ContainerStarted","Data":"ff65f0401bed7c003015eda75e4a6402c472ee28da532e6f04573bbbebd373aa"} Apr 24 16:44:12.553610 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:12.553511 2575 scope.go:117] "RemoveContainer" containerID="b709e703b44e91923f14815d92f635482a50ef5b687b7cb41b9124dba6c04a83" Apr 24 16:44:12.553754 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:12.553686 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:44:14.608025 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:14.607993 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:14.611838 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:14.611821 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:15.341350 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:15.341316 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:15.341350 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:15.341355 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:15.345519 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:15.345497 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:15.568301 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:15.568265 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:44:15.616089 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:15.616005 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:44:19.268152 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.268088 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6d7fbd94b-xn8fc" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" containerID="cri-o://deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c" gracePeriod=15 Apr 24 16:44:19.504700 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.504671 2575 scope.go:117] "RemoveContainer" containerID="f368d829c1e1e14e30cb97f8c3f4f13db078b4e8048414daf2acc370134f0fa2" Apr 24 16:44:19.505861 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.505842 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d7fbd94b-xn8fc_0bd66365-6a23-4c9b-a037-0d6e249a056e/console/0.log" Apr 24 16:44:19.505958 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.505896 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:44:19.533938 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.533908 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534033 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.533961 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534033 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534002 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534172 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534053 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534172 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534091 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534172 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534165 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534321 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534191 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert\") pod \"0bd66365-6a23-4c9b-a037-0d6e249a056e\" (UID: \"0bd66365-6a23-4c9b-a037-0d6e249a056e\") " Apr 24 16:44:19.534704 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534678 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:19.535011 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534541 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:19.535011 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534953 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-trusted-ca-bundle\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.535011 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534959 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config" (OuterVolumeSpecName: "console-config") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:19.535011 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534970 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-oauth-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.535011 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.534790 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca" (OuterVolumeSpecName: "service-ca") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:19.536502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.536473 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). 
InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:19.536502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.536484 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc" (OuterVolumeSpecName: "kube-api-access-v5fvc") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "kube-api-access-v5fvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:44:19.537316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.537300 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0bd66365-6a23-4c9b-a037-0d6e249a056e" (UID: "0bd66365-6a23-4c9b-a037-0d6e249a056e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:19.583993 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.583961 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d7fbd94b-xn8fc_0bd66365-6a23-4c9b-a037-0d6e249a056e/console/0.log" Apr 24 16:44:19.584089 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.584015 2575 generic.go:358] "Generic (PLEG): container finished" podID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerID="deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c" exitCode=2 Apr 24 16:44:19.584159 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.584130 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7fbd94b-xn8fc" event={"ID":"0bd66365-6a23-4c9b-a037-0d6e249a056e","Type":"ContainerDied","Data":"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c"} Apr 24 16:44:19.584205 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.584171 2575 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d7fbd94b-xn8fc" event={"ID":"0bd66365-6a23-4c9b-a037-0d6e249a056e","Type":"ContainerDied","Data":"a263775cf61ba6d5250aa13fd65144e2947372ebed81a2c0add59190a8dcce4b"} Apr 24 16:44:19.584205 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.584190 2575 scope.go:117] "RemoveContainer" containerID="deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c" Apr 24 16:44:19.584285 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.584204 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d7fbd94b-xn8fc" Apr 24 16:44:19.592180 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.592160 2575 scope.go:117] "RemoveContainer" containerID="deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c" Apr 24 16:44:19.592553 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:19.592446 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c\": container with ID starting with deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c not found: ID does not exist" containerID="deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c" Apr 24 16:44:19.592553 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.592482 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c"} err="failed to get container status \"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c\": rpc error: code = NotFound desc = could not find container \"deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c\": container with ID starting with deae284aa90fbc0d9db16e58c231f7a471f4a09b03ba8913c722e85510ec5e5c not found: ID does not exist" Apr 24 16:44:19.605880 ip-10-0-137-83 kubenswrapper[2575]: I0424 
16:44:19.605856 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:44:19.609741 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.609718 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d7fbd94b-xn8fc"] Apr 24 16:44:19.635558 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.635522 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-oauth-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.635679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.635561 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5fvc\" (UniqueName: \"kubernetes.io/projected/0bd66365-6a23-4c9b-a037-0d6e249a056e-kube-api-access-v5fvc\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.635679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.635578 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.635679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.635592 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-console-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:19.635679 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:19.635604 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bd66365-6a23-4c9b-a037-0d6e249a056e-service-ca\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:20.588827 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:20.588792 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" event={"ID":"d0351ca2-fa39-42bc-b374-b53d08923ed5","Type":"ContainerStarted","Data":"826fe90d02aee9ab96abda512ed7cbcaa7dc746f1ca2c1d5b11575f10dcb8267"} Apr 24 16:44:20.589296 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:20.589138 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:44:20.590648 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:20.590629 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-68f9c6d688-t56dc" Apr 24 16:44:21.509366 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:21.509335 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" path="/var/lib/kubelet/pods/0bd66365-6a23-4c9b-a037-0d6e249a056e/volumes" Apr 24 16:44:30.021122 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.021035 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7d4b845d79-db8xv" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" containerID="cri-o://8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79" gracePeriod=15 Apr 24 16:44:30.259599 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.259576 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4b845d79-db8xv_91eefffc-7351-4ac1-802a-4880b5632d16/console/0.log" Apr 24 16:44:30.259711 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.259636 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d4b845d79-db8xv" Apr 24 16:44:30.319179 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319076 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319179 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319151 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319385 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319199 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319385 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319222 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cstxj\" (UniqueName: \"kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319385 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319253 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319385 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319287 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319385 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319337 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert\") pod \"91eefffc-7351-4ac1-802a-4880b5632d16\" (UID: \"91eefffc-7351-4ac1-802a-4880b5632d16\") " Apr 24 16:44:30.319657 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319615 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca" (OuterVolumeSpecName: "service-ca") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:30.319706 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319654 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:30.319762 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319735 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config" (OuterVolumeSpecName: "console-config") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:30.319833 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.319816 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:30.321702 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.321670 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:30.321872 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.321851 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:30.321966 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.321947 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj" (OuterVolumeSpecName: "kube-api-access-cstxj") pod "91eefffc-7351-4ac1-802a-4880b5632d16" (UID: "91eefffc-7351-4ac1-802a-4880b5632d16"). InnerVolumeSpecName "kube-api-access-cstxj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:44:30.420724 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420686 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420724 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420717 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-console-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420724 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420727 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-oauth-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420724 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420735 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-service-ca\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420961 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420744 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/91eefffc-7351-4ac1-802a-4880b5632d16-trusted-ca-bundle\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420961 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420752 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/91eefffc-7351-4ac1-802a-4880b5632d16-console-oauth-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.420961 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.420760 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cstxj\" (UniqueName: \"kubernetes.io/projected/91eefffc-7351-4ac1-802a-4880b5632d16-kube-api-access-cstxj\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:30.619804 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619733 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7d4b845d79-db8xv_91eefffc-7351-4ac1-802a-4880b5632d16/console/0.log" Apr 24 16:44:30.619804 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619778 2575 generic.go:358] "Generic (PLEG): container finished" podID="91eefffc-7351-4ac1-802a-4880b5632d16" containerID="8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79" exitCode=2 Apr 24 16:44:30.619995 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619848 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7d4b845d79-db8xv" Apr 24 16:44:30.619995 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619862 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4b845d79-db8xv" event={"ID":"91eefffc-7351-4ac1-802a-4880b5632d16","Type":"ContainerDied","Data":"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79"} Apr 24 16:44:30.619995 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619899 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7d4b845d79-db8xv" event={"ID":"91eefffc-7351-4ac1-802a-4880b5632d16","Type":"ContainerDied","Data":"cd48fc920711ec79e83a146abdbd2e692db48a9947eb8312f683e38338475b5c"} Apr 24 16:44:30.619995 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.619914 2575 scope.go:117] "RemoveContainer" containerID="8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79" Apr 24 16:44:30.628443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.628427 2575 scope.go:117] "RemoveContainer" containerID="8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79" Apr 24 16:44:30.628710 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:30.628689 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79\": container with ID starting with 8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79 not found: ID does not exist" containerID="8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79" Apr 24 16:44:30.628796 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.628717 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79"} err="failed to get container status \"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79\": rpc error: code = 
NotFound desc = could not find container \"8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79\": container with ID starting with 8c80a1da651d25f815537331d51812ebc38db54898b26edd5e8e5e14dcc8fa79 not found: ID does not exist" Apr 24 16:44:30.641414 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.641388 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"] Apr 24 16:44:30.643221 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:30.643202 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7d4b845d79-db8xv"] Apr 24 16:44:31.508748 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:31.508716 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" path="/var/lib/kubelet/pods/91eefffc-7351-4ac1-802a-4880b5632d16/volumes" Apr 24 16:44:33.561511 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:33.561480 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b969cbff9-f4cgw" Apr 24 16:44:41.586061 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.586001 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b7c79cc4f-xzdfr" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerName="console" containerID="cri-o://255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab" gracePeriod=15 Apr 24 16:44:41.828930 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.828909 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b7c79cc4f-xzdfr_74947aed-ab5c-468a-8f5d-e8d2affda983/console/0.log" Apr 24 16:44:41.829035 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.828967 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:41.912848 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912769 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.912848 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912812 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.912848 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912841 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.913075 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912875 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.913075 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912928 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.913075 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912954 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.913075 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.912989 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zmmv\" (UniqueName: \"kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv\") pod \"74947aed-ab5c-468a-8f5d-e8d2affda983\" (UID: \"74947aed-ab5c-468a-8f5d-e8d2affda983\") " Apr 24 16:44:41.913324 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913290 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:41.913440 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913306 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:41.913440 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913309 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config" (OuterVolumeSpecName: "console-config") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:41.913440 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913417 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-console-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:41.913440 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913437 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-trusted-ca-bundle\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:41.913605 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913452 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-oauth-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:41.913605 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.913544 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca" (OuterVolumeSpecName: "service-ca") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:44:41.915270 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.915246 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv" (OuterVolumeSpecName: "kube-api-access-4zmmv") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "kube-api-access-4zmmv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:44:41.915579 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.915550 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:41.915579 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:41.915573 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "74947aed-ab5c-468a-8f5d-e8d2affda983" (UID: "74947aed-ab5c-468a-8f5d-e8d2affda983"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:44:42.014168 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.014134 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-oauth-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:42.014168 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.014169 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/74947aed-ab5c-468a-8f5d-e8d2affda983-service-ca\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:42.014168 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.014179 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4zmmv\" (UniqueName: \"kubernetes.io/projected/74947aed-ab5c-468a-8f5d-e8d2affda983-kube-api-access-4zmmv\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:42.014334 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.014188 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/74947aed-ab5c-468a-8f5d-e8d2affda983-console-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:44:42.656989 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.656958 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b7c79cc4f-xzdfr_74947aed-ab5c-468a-8f5d-e8d2affda983/console/0.log" Apr 24 16:44:42.657466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.657000 2575 generic.go:358] "Generic (PLEG): container finished" podID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerID="255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab" exitCode=2 Apr 24 16:44:42.657466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.657055 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-b7c79cc4f-xzdfr" event={"ID":"74947aed-ab5c-468a-8f5d-e8d2affda983","Type":"ContainerDied","Data":"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab"} Apr 24 16:44:42.657466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.657071 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b7c79cc4f-xzdfr" Apr 24 16:44:42.657466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.657099 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b7c79cc4f-xzdfr" event={"ID":"74947aed-ab5c-468a-8f5d-e8d2affda983","Type":"ContainerDied","Data":"b89a8653b078ec8de9fcaea6fca1f39f88724de9066eff38cfa2e0a33b5e3f02"} Apr 24 16:44:42.657466 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.657135 2575 scope.go:117] "RemoveContainer" containerID="255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab" Apr 24 16:44:42.665651 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.665631 2575 scope.go:117] "RemoveContainer" containerID="255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab" Apr 24 16:44:42.665895 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:44:42.665875 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab\": container with ID starting with 255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab not found: ID does not exist" containerID="255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab" Apr 24 16:44:42.665944 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.665904 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab"} err="failed to get container status \"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab\": rpc error: code = NotFound 
desc = could not find container \"255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab\": container with ID starting with 255b8bd8f11f87ea0498ab89d70c0cd88a4c755540fd83d5fd1dcb4d9f7259ab not found: ID does not exist" Apr 24 16:44:42.679933 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.679912 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:44:42.683520 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:42.683503 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b7c79cc4f-xzdfr"] Apr 24 16:44:43.508954 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:44:43.508922 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" path="/var/lib/kubelet/pods/74947aed-ab5c-468a-8f5d-e8d2affda983/volumes" Apr 24 16:48:59.522256 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522222 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5f654d599-gvphk"] Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522487 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522497 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522518 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522523 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522534 2575 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522538 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522586 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bd66365-6a23-4c9b-a037-0d6e249a056e" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522593 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="74947aed-ab5c-468a-8f5d-e8d2affda983" containerName="console" Apr 24 16:48:59.522656 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.522603 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="91eefffc-7351-4ac1-802a-4880b5632d16" containerName="console" Apr 24 16:48:59.525265 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.525248 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.541429 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.541400 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f654d599-gvphk"] Apr 24 16:48:59.606151 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606093 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606319 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-oauth-config\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606319 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606183 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-config\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606319 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606201 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-service-ca\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606319 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606240 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-oauth-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606319 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606292 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-trusted-ca-bundle\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.606479 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.606333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djjp8\" (UniqueName: \"kubernetes.io/projected/2991c990-e1a8-4649-95a7-e39b3ba6f92a-kube-api-access-djjp8\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707418 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707384 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707432 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-oauth-config\") pod 
\"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707452 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-config\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707472 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-service-ca\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707510 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-oauth-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707526 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-trusted-ca-bundle\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.707569 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.707549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-djjp8\" (UniqueName: 
\"kubernetes.io/projected/2991c990-e1a8-4649-95a7-e39b3ba6f92a-kube-api-access-djjp8\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.708774 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.708580 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-config\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.708916 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.708809 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-trusted-ca-bundle\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.708916 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.708819 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-oauth-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.709036 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.708942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2991c990-e1a8-4649-95a7-e39b3ba6f92a-service-ca\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.710870 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.710837 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-oauth-config\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.715550 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.715532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2991c990-e1a8-4649-95a7-e39b3ba6f92a-console-serving-cert\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.719269 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.719240 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-djjp8\" (UniqueName: \"kubernetes.io/projected/2991c990-e1a8-4649-95a7-e39b3ba6f92a-kube-api-access-djjp8\") pod \"console-5f654d599-gvphk\" (UID: \"2991c990-e1a8-4649-95a7-e39b3ba6f92a\") " pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.834222 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.834148 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:48:59.961585 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:48:59.961562 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5f654d599-gvphk"] Apr 24 16:48:59.964051 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:48:59.964024 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2991c990_e1a8_4649_95a7_e39b3ba6f92a.slice/crio-9e07ce9bac5d5d84a215f10df99c9716c69cf21b94878c421250e63f02c2d1d9 WatchSource:0}: Error finding container 9e07ce9bac5d5d84a215f10df99c9716c69cf21b94878c421250e63f02c2d1d9: Status 404 returned error can't find the container with id 9e07ce9bac5d5d84a215f10df99c9716c69cf21b94878c421250e63f02c2d1d9 Apr 24 16:49:00.362517 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:00.362480 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f654d599-gvphk" event={"ID":"2991c990-e1a8-4649-95a7-e39b3ba6f92a","Type":"ContainerStarted","Data":"45985ff8dd22c262feef7471d4138e72eb1855a0681526a494fea8ca157faf36"} Apr 24 16:49:00.362684 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:00.362524 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5f654d599-gvphk" event={"ID":"2991c990-e1a8-4649-95a7-e39b3ba6f92a","Type":"ContainerStarted","Data":"9e07ce9bac5d5d84a215f10df99c9716c69cf21b94878c421250e63f02c2d1d9"} Apr 24 16:49:00.403241 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:00.403199 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5f654d599-gvphk" podStartSLOduration=1.403185824 podStartE2EDuration="1.403185824s" podCreationTimestamp="2026-04-24 16:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:49:00.400061141 +0000 UTC m=+593.390749969" 
watchObservedRunningTime="2026-04-24 16:49:00.403185824 +0000 UTC m=+593.393874643" Apr 24 16:49:07.426014 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:07.425984 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:49:07.427842 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:07.427821 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:49:09.834497 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:09.834451 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:49:09.836849 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:09.834540 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:49:09.839256 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:09.839239 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:49:10.394822 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:10.394790 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5f654d599-gvphk" Apr 24 16:49:10.442445 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:10.442412 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:49:35.462217 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.462136 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-546fd77464-7fnjz" podUID="0f48b270-f607-407c-af85-145d9d54d482" containerName="console" containerID="cri-o://3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd" gracePeriod=15 Apr 24 16:49:35.565624 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.565592 2575 patch_prober.go:28] interesting pod/console-546fd77464-7fnjz container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.132.0.21:8443/health\": dial tcp 10.132.0.21:8443: connect: connection refused" start-of-body= Apr 24 16:49:35.565748 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.565652 2575 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console/console-546fd77464-7fnjz" podUID="0f48b270-f607-407c-af85-145d9d54d482" containerName="console" probeResult="failure" output="Get \"https://10.132.0.21:8443/health\": dial tcp 10.132.0.21:8443: connect: connection refused" Apr 24 16:49:35.695471 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.695448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546fd77464-7fnjz_0f48b270-f607-407c-af85-145d9d54d482/console/0.log" Apr 24 16:49:35.695609 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.695531 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:49:35.890215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890124 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890181 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890215 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890207 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890475 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890235 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890475 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890260 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p2hm\" (UniqueName: \"kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890475 
ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890357 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890475 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890400 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config\") pod \"0f48b270-f607-407c-af85-145d9d54d482\" (UID: \"0f48b270-f607-407c-af85-145d9d54d482\") " Apr 24 16:49:35.890676 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890637 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:35.890676 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890652 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:35.890768 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890667 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca" (OuterVolumeSpecName: "service-ca") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:35.890918 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.890892 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config" (OuterVolumeSpecName: "console-config") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 16:49:35.892636 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.892598 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:49:35.892733 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.892666 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 16:49:35.892840 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.892816 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm" (OuterVolumeSpecName: "kube-api-access-2p2hm") pod "0f48b270-f607-407c-af85-145d9d54d482" (UID: "0f48b270-f607-407c-af85-145d9d54d482"). InnerVolumeSpecName "kube-api-access-2p2hm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 16:49:35.991316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991288 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991312 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-console-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991316 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991321 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-trusted-ca-bundle\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991330 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-oauth-serving-cert\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991340 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0f48b270-f607-407c-af85-145d9d54d482-service-ca\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991349 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f48b270-f607-407c-af85-145d9d54d482-console-oauth-config\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:35.991502 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:35.991357 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2p2hm\" (UniqueName: \"kubernetes.io/projected/0f48b270-f607-407c-af85-145d9d54d482-kube-api-access-2p2hm\") on node \"ip-10-0-137-83.ec2.internal\" DevicePath \"\"" Apr 24 16:49:36.462096 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462067 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-546fd77464-7fnjz_0f48b270-f607-407c-af85-145d9d54d482/console/0.log" Apr 24 16:49:36.462297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462124 2575 generic.go:358] "Generic (PLEG): container finished" podID="0f48b270-f607-407c-af85-145d9d54d482" containerID="3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd" exitCode=2 Apr 24 16:49:36.462297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462191 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546fd77464-7fnjz" event={"ID":"0f48b270-f607-407c-af85-145d9d54d482","Type":"ContainerDied","Data":"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd"} Apr 24 16:49:36.462297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462198 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-546fd77464-7fnjz" Apr 24 16:49:36.462297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462218 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-546fd77464-7fnjz" event={"ID":"0f48b270-f607-407c-af85-145d9d54d482","Type":"ContainerDied","Data":"1c02b7ab434988bb3230d3f2da5aaf4dcacd82f4bbfe9d244e0db07d08d6782b"} Apr 24 16:49:36.462297 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.462233 2575 scope.go:117] "RemoveContainer" containerID="3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd" Apr 24 16:49:36.470987 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.470971 2575 scope.go:117] "RemoveContainer" containerID="3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd" Apr 24 16:49:36.471269 ip-10-0-137-83 kubenswrapper[2575]: E0424 16:49:36.471242 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd\": container with ID starting with 3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd not found: ID does not exist" containerID="3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd" Apr 24 16:49:36.471334 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.471281 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd"} err="failed to get container status \"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd\": rpc error: code = NotFound desc = could not find container \"3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd\": container with ID starting with 3a81beb126bb09b04f3fc6a81163a57753fd626677d617b05a0cfe428b19edcd not found: ID does not exist" Apr 24 16:49:36.484389 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.484368 2575 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:49:36.488235 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:36.488213 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-546fd77464-7fnjz"] Apr 24 16:49:37.510519 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:49:37.510487 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f48b270-f607-407c-af85-145d9d54d482" path="/var/lib/kubelet/pods/0f48b270-f607-407c-af85-145d9d54d482/volumes" Apr 24 16:54:07.445200 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:07.445122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:54:07.448465 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:07.448440 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-m22zz_6e8405b8-571c-4fb5-8e11-7148ed4e4115/ovn-acl-logging/0.log" Apr 24 16:54:28.671471 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.671433 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wz7wm/must-gather-rwx6t"] Apr 24 16:54:28.673822 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.671739 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f48b270-f607-407c-af85-145d9d54d482" containerName="console" Apr 24 16:54:28.673822 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.671750 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f48b270-f607-407c-af85-145d9d54d482" containerName="console" Apr 24 16:54:28.673822 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.671813 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f48b270-f607-407c-af85-145d9d54d482" containerName="console" Apr 24 16:54:28.674721 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.674705 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.677418 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.677393 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wz7wm\"/\"openshift-service-ca.crt\"" Apr 24 16:54:28.678467 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.678444 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wz7wm\"/\"default-dockercfg-wj4t8\"" Apr 24 16:54:28.678585 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.678498 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wz7wm\"/\"kube-root-ca.crt\"" Apr 24 16:54:28.683566 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.683546 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/must-gather-rwx6t"] Apr 24 16:54:28.798379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.798350 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgt27\" (UniqueName: \"kubernetes.io/projected/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-kube-api-access-bgt27\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.798379 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.798385 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-must-gather-output\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.899415 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.899366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bgt27\" (UniqueName: 
\"kubernetes.io/projected/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-kube-api-access-bgt27\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.899415 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.899423 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-must-gather-output\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.899790 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.899770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-must-gather-output\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.907972 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.907625 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgt27\" (UniqueName: \"kubernetes.io/projected/82747b04-96ca-4ffe-a9b5-9e5be4ad5b74-kube-api-access-bgt27\") pod \"must-gather-rwx6t\" (UID: \"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74\") " pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:28.984456 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:28.984430 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" Apr 24 16:54:29.101632 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:29.101607 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/must-gather-rwx6t"] Apr 24 16:54:29.104312 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:54:29.104283 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82747b04_96ca_4ffe_a9b5_9e5be4ad5b74.slice/crio-c327b992376190d18234e2824d27e2803e4729c8a56628b7af4e9a42e3fdc6d4 WatchSource:0}: Error finding container c327b992376190d18234e2824d27e2803e4729c8a56628b7af4e9a42e3fdc6d4: Status 404 returned error can't find the container with id c327b992376190d18234e2824d27e2803e4729c8a56628b7af4e9a42e3fdc6d4 Apr 24 16:54:29.106014 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:29.105998 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 16:54:29.267100 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:29.267023 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" event={"ID":"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74","Type":"ContainerStarted","Data":"c327b992376190d18234e2824d27e2803e4729c8a56628b7af4e9a42e3fdc6d4"} Apr 24 16:54:30.272005 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:30.271964 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" event={"ID":"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74","Type":"ContainerStarted","Data":"3d9e551dd82bb59d09da3a2520413653658a99429c2ab4b985646351a337bd5e"} Apr 24 16:54:30.272005 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:30.272005 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" 
event={"ID":"82747b04-96ca-4ffe-a9b5-9e5be4ad5b74","Type":"ContainerStarted","Data":"eca1cd8d3dda73a9b7c461549afac5906f7c79d72d06ccb63d56d04b8adca99f"} Apr 24 16:54:31.510888 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:31.510855 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pqspn_a49b4d14-b188-40e4-828a-7109543078dc/global-pull-secret-syncer/0.log" Apr 24 16:54:31.611808 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:31.611780 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-zfvwh_71ece18d-8c30-46e6-aea9-dad90b2644cb/konnectivity-agent/0.log" Apr 24 16:54:31.654063 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:31.654033 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-83.ec2.internal_d5a1f5c4f174a8faa48510e8386159fc/haproxy/0.log" Apr 24 16:54:35.268289 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.268259 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/node-exporter/0.log" Apr 24 16:54:35.292648 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.292614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/kube-rbac-proxy/0.log" Apr 24 16:54:35.322063 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.322037 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5gkcs_c769eab2-225b-4974-b9d8-355c809d940c/init-textfile/0.log" Apr 24 16:54:35.490878 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.490833 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/kube-rbac-proxy-main/0.log" Apr 24 16:54:35.511611 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.511566 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/kube-rbac-proxy-self/0.log" Apr 24 16:54:35.534904 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.534814 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-zswgf_1df51550-2fa9-479c-9498-a30481fa319b/openshift-state-metrics/0.log" Apr 24 16:54:35.759693 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.759647 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jbmtl_d37af086-7938-46de-bda3-6b2a30be7321/prometheus-operator/0.log" Apr 24 16:54:35.782073 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.782044 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-jbmtl_d37af086-7938-46de-bda3-6b2a30be7321/kube-rbac-proxy/0.log" Apr 24 16:54:35.916895 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.916821 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/thanos-query/0.log" Apr 24 16:54:35.943879 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.943849 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-web/0.log" Apr 24 16:54:35.964384 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.964351 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy/0.log" Apr 24 16:54:35.988640 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:35.988611 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/prom-label-proxy/0.log" Apr 24 16:54:36.009776 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:36.009743 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-rules/0.log" Apr 24 16:54:36.048078 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:36.048051 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-5d9d75b6dd-5nqtk_592c17e3-bd85-4c68-9861-be16b33fb3ce/kube-rbac-proxy-metrics/0.log" Apr 24 16:54:37.163946 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:37.163912 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-qsbsc_796ab223-e545-471e-9868-71174bdad1bf/networking-console-plugin/0.log" Apr 24 16:54:37.973443 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:37.973420 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5f654d599-gvphk_2991c990-e1a8-4649-95a7-e39b3ba6f92a/console/0.log" Apr 24 16:54:38.000519 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.000493 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-zw6kr_e75cf70a-707a-43de-bb15-bdf89460c4ca/download-server/0.log" Apr 24 16:54:38.578259 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.578196 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wz7wm/must-gather-rwx6t" podStartSLOduration=9.730693169 podStartE2EDuration="10.578173622s" podCreationTimestamp="2026-04-24 16:54:28 +0000 UTC" firstStartedPulling="2026-04-24 16:54:29.106148659 +0000 UTC m=+922.096837457" lastFinishedPulling="2026-04-24 16:54:29.953629111 +0000 UTC m=+922.944317910" observedRunningTime="2026-04-24 16:54:30.289489915 +0000 UTC m=+923.280178732" 
watchObservedRunningTime="2026-04-24 16:54:38.578173622 +0000 UTC m=+931.568862444" Apr 24 16:54:38.579100 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.579082 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th"] Apr 24 16:54:38.583494 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.583469 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.589231 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.589206 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th"] Apr 24 16:54:38.690066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.690028 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsg4w\" (UniqueName: \"kubernetes.io/projected/c1abd6cf-417b-41da-8d82-f51374ef6203-kube-api-access-dsg4w\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.690066 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.690068 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-proc\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.690295 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.690149 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-podres\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " 
pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.690295 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.690213 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-lib-modules\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.690295 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.690245 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-sys\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791031 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.790995 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsg4w\" (UniqueName: \"kubernetes.io/projected/c1abd6cf-417b-41da-8d82-f51374ef6203-kube-api-access-dsg4w\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791355 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791327 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-proc\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791453 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-podres\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791497 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791469 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-proc\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791542 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791518 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-lib-modules\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791587 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-sys\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791644 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791621 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-podres\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791689 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791658 2575 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-lib-modules\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.791734 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.791689 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c1abd6cf-417b-41da-8d82-f51374ef6203-sys\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.800576 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.800526 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsg4w\" (UniqueName: \"kubernetes.io/projected/c1abd6cf-417b-41da-8d82-f51374ef6203-kube-api-access-dsg4w\") pod \"perf-node-gather-daemonset-2c4th\" (UID: \"c1abd6cf-417b-41da-8d82-f51374ef6203\") " pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:38.897769 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:38.897697 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:39.042547 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.042483 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th"] Apr 24 16:54:39.045862 ip-10-0-137-83 kubenswrapper[2575]: W0424 16:54:39.045835 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc1abd6cf_417b_41da_8d82_f51374ef6203.slice/crio-dfc35ff4ec2bef09760f84307c3dd4fd14b2f8f127ca1cddcad4fd3e94c87e5f WatchSource:0}: Error finding container dfc35ff4ec2bef09760f84307c3dd4fd14b2f8f127ca1cddcad4fd3e94c87e5f: Status 404 returned error can't find the container with id dfc35ff4ec2bef09760f84307c3dd4fd14b2f8f127ca1cddcad4fd3e94c87e5f Apr 24 16:54:39.096029 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.096009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b68bd_25e3ae92-c693-4c50-b5ce-0ed6ad115edd/dns/0.log" Apr 24 16:54:39.115826 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.115809 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-b68bd_25e3ae92-c693-4c50-b5ce-0ed6ad115edd/kube-rbac-proxy/0.log" Apr 24 16:54:39.230945 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.230925 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-49ml9_7f2523fe-21a3-46f7-a03b-88e7ae991338/dns-node-resolver/0.log" Apr 24 16:54:39.304899 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.304849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" event={"ID":"c1abd6cf-417b-41da-8d82-f51374ef6203","Type":"ContainerStarted","Data":"d0279e54f47c4d09b4c763918f301ad1ea4474650ccdad93122f80b94ae6b5da"} Apr 24 16:54:39.304899 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.304889 2575 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" event={"ID":"c1abd6cf-417b-41da-8d82-f51374ef6203","Type":"ContainerStarted","Data":"dfc35ff4ec2bef09760f84307c3dd4fd14b2f8f127ca1cddcad4fd3e94c87e5f"} Apr 24 16:54:39.305156 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.305021 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:39.322799 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:39.322453 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" podStartSLOduration=1.322435002 podStartE2EDuration="1.322435002s" podCreationTimestamp="2026-04-24 16:54:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 16:54:39.320465472 +0000 UTC m=+932.311154310" watchObservedRunningTime="2026-04-24 16:54:39.322435002 +0000 UTC m=+932.313123820" Apr 24 16:54:45.319028 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:45.318998 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wz7wm/perf-node-gather-daemonset-2c4th" Apr 24 16:54:52.818164 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:52.818133 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b969cbff9-f4cgw_8e30bd49-e02e-4cb9-9908-d3dd4ede132a/registry/1.log" Apr 24 16:54:52.819704 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:52.819686 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-b969cbff9-f4cgw_8e30bd49-e02e-4cb9-9908-d3dd4ede132a/registry/2.log" Apr 24 16:54:53.058737 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:53.058708 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-image-registry_node-ca-vvwg9_f09d386a-3466-46d1-a1d1-efb87cc77eba/node-ca/0.log" Apr 24 16:54:54.354244 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:54.354211 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5nfqq_b0e3d259-e5e4-4160-8258-8d97913d476a/serve-healthcheck-canary/0.log" Apr 24 16:54:54.949709 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:54.949613 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9sq68_01f900db-1f85-4389-9d11-55baa30ef7c7/kube-rbac-proxy/0.log" Apr 24 16:54:54.973378 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:54.973346 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9sq68_01f900db-1f85-4389-9d11-55baa30ef7c7/exporter/0.log" Apr 24 16:54:54.997723 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:54:54.997645 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-9sq68_01f900db-1f85-4389-9d11-55baa30ef7c7/extractor/0.log" Apr 24 16:55:01.899568 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:55:01.899536 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b9xc2_647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b/kube-multus-additional-cni-plugins/0.log" Apr 24 16:55:01.922324 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:55:01.922295 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b9xc2_647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b/egress-router-binary-copy/0.log" Apr 24 16:55:01.943235 ip-10-0-137-83 kubenswrapper[2575]: I0424 16:55:01.943211 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b9xc2_647cb6b6-c1d7-4f0a-bd40-b5e358e9c90b/cni-plugins/0.log"