Apr 16 08:32:51.486815 ip-10-0-139-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 08:32:51.965737 ip-10-0-139-144 kubenswrapper[2535]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:32:51.965737 ip-10-0-139-144 kubenswrapper[2535]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 08:32:51.965737 ip-10-0-139-144 kubenswrapper[2535]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:32:51.965737 ip-10-0-139-144 kubenswrapper[2535]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 08:32:51.965737 ip-10-0-139-144 kubenswrapper[2535]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 08:32:51.967428 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.967336 2535 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 08:32:51.971267 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971251 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:32:51.971267 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971267 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971272 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971276 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971280 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971283 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971286 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971289 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971292 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971295 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971298 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971301 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971304 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971306 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971309 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971312 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971315 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971319 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971321 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971324 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:32:51.971330 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971327 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971330 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971333 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971336 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971338 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971341 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971344 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971346 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971351 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971355 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971358 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971361 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971363 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971366 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971369 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971372 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971375 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971378 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971381 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:32:51.971838 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971383 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971386 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971389 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971392 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971394 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971397 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971400 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971402 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971405 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971407 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971410 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971412 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971415 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971417 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971420 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971423 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971426 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971430 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971434 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:32:51.972304 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971437 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971440 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971443 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971445 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971448 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971451 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971454 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971457 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971460 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971464 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971466 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971469 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971472 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971474 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971477 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971479 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971482 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971485 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971498 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971501 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:32:51.972786 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971504 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971506 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971509 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971511 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971514 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971517 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971519 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971522 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971903 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971908 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971911 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971914 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971917 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971920 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971923 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971925 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971928 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971931 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971933 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971936 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:32:51.973259 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971938 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971941 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971944 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971947 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971950 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971953 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971955 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971958 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971960 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971963 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971966 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971969 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971971 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971974 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971976 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971979 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971981 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971984 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971986 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971989 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:32:51.973751 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971992 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971995 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.971997 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972000 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972003 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972005 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972008 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972011 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972014 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972017 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972019 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972022 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972024 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972027 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972029 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972032 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972034 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972037 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972039 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972042 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:32:51.974278 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972045 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972047 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972050 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972052 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972055 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972060 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972065 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972068 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972071 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972074 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972077 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972080 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972082 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972085 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972088 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972092 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972095 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972097 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972102 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:32:51.974847 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972105 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972108 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972111 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972113 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972116 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972118 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972121 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972123 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972126 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972128 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972131 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972133 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972136 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972138 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.972141 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973693 2535 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973702 2535 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973715 2535 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973720 2535 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973725 2535 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973728 2535 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 08:32:51.975375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973732 2535 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973737 2535 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973740 2535 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973743 2535 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973747 2535 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973750 2535 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973753 2535 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973757 2535 flags.go:64] FLAG: --cgroup-root=""
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973760 2535 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973763 2535 flags.go:64] FLAG: --client-ca-file=""
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973766 2535 flags.go:64] FLAG: --cloud-config=""
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973769 2535 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973773 2535 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973777 2535 flags.go:64] FLAG: --cluster-domain=""
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973780 2535 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973784 2535 flags.go:64] FLAG: --config-dir=""
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973786 2535 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973790 2535 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973794 2535 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973797 2535 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973800 2535 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973803 2535 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973806 2535 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973809 2535 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 08:32:51.975980 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973812 2535 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973815 2535 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973818 2535 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973823 2535 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973826 2535 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973829 2535 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973832 2535 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973835 2535 flags.go:64] FLAG: --enable-server="true"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973838 2535 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973844 2535 flags.go:64] FLAG: --event-burst="100"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973847 2535 flags.go:64] FLAG: --event-qps="50"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973850 2535 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973853 2535 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973856 2535 flags.go:64] FLAG: --eviction-hard=""
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973860 2535 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973863 2535 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973866 2535 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973869 2535 flags.go:64] FLAG: --eviction-soft=""
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973871 2535 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973874 2535 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]:
I0416 08:32:51.973878 2535 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973881 2535 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973884 2535 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973887 2535 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973890 2535 flags.go:64] FLAG: --feature-gates="" Apr 16 08:32:51.976749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973894 2535 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973897 2535 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973901 2535 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973904 2535 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973907 2535 flags.go:64] FLAG: --healthz-port="10248" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973910 2535 flags.go:64] FLAG: --help="false" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973913 2535 flags.go:64] FLAG: --hostname-override="ip-10-0-139-144.ec2.internal" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973916 2535 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973919 2535 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973923 2535 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973926 2535 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973930 2535 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973933 2535 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973936 2535 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973939 2535 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973942 2535 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973945 2535 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973948 2535 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973951 2535 flags.go:64] FLAG: --kube-reserved="" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973954 2535 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973957 2535 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973960 2535 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973963 2535 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 08:32:51.977462 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:32:51.973966 2535 flags.go:64] FLAG: --lock-file="" Apr 16 08:32:51.977462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973969 2535 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973971 2535 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973974 2535 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973980 2535 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973983 2535 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973986 2535 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973988 2535 flags.go:64] FLAG: --logging-format="text" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973991 2535 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973995 2535 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.973997 2535 flags.go:64] FLAG: --manifest-url="" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974000 2535 flags.go:64] FLAG: --manifest-url-header="" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974005 2535 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974008 2535 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974012 2535 flags.go:64] FLAG: --max-pods="110" Apr 16 
08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974015 2535 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974018 2535 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974021 2535 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974024 2535 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974027 2535 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974030 2535 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974033 2535 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974041 2535 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974044 2535 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974048 2535 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 08:32:51.978062 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974051 2535 flags.go:64] FLAG: --pod-cidr="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974054 2535 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974060 2535 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:32:51.974063 2535 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974067 2535 flags.go:64] FLAG: --pods-per-core="0" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974070 2535 flags.go:64] FLAG: --port="10250" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974073 2535 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974076 2535 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-031b85b1284afa65f" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974079 2535 flags.go:64] FLAG: --qos-reserved="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974082 2535 flags.go:64] FLAG: --read-only-port="10255" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974085 2535 flags.go:64] FLAG: --register-node="true" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974089 2535 flags.go:64] FLAG: --register-schedulable="true" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974092 2535 flags.go:64] FLAG: --register-with-taints="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974095 2535 flags.go:64] FLAG: --registry-burst="10" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974098 2535 flags.go:64] FLAG: --registry-qps="5" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974101 2535 flags.go:64] FLAG: --reserved-cpus="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974104 2535 flags.go:64] FLAG: --reserved-memory="" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974107 2535 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974110 2535 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974114 2535 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974116 2535 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974119 2535 flags.go:64] FLAG: --runonce="false" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974122 2535 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974125 2535 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974128 2535 flags.go:64] FLAG: --seccomp-default="false" Apr 16 08:32:51.978671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974131 2535 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974134 2535 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974137 2535 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974140 2535 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974143 2535 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974146 2535 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974149 2535 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974152 2535 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 
08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974155 2535 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974158 2535 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974161 2535 flags.go:64] FLAG: --system-cgroups="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974163 2535 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974169 2535 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974172 2535 flags.go:64] FLAG: --tls-cert-file="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974174 2535 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974178 2535 flags.go:64] FLAG: --tls-min-version="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974181 2535 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974184 2535 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974189 2535 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974192 2535 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974195 2535 flags.go:64] FLAG: --v="2" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974199 2535 flags.go:64] FLAG: --version="false" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974203 2535 flags.go:64] FLAG: --vmodule="" 
Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974207 2535 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.974210 2535 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 08:32:51.979272 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974306 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974310 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974313 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974316 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974321 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974324 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974327 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974330 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974333 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974335 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974338 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974341 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974343 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974346 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974348 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974351 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974354 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974357 2535 
feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974359 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 08:32:51.979889 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974362 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974364 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974367 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974379 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974382 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974385 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974388 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974391 2535 feature_gate.go:328] unrecognized feature gate: Example Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974394 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974397 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974399 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 08:32:51.980421 
ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974402 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974404 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974407 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974410 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974412 2535 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974417 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974419 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974422 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974424 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 08:32:51.980421 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974427 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974431 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974434 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974436 2535 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerificationPKI Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974439 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974442 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974444 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974447 2535 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974449 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974453 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974456 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974458 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974461 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974464 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974466 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974469 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974472 2535 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974474 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974476 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974479 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 08:32:51.980942 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974482 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974484 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974504 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974508 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974512 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974517 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974520 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974522 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974526 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release.
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974529 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974532 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974535 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974538 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974542 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974544 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974547 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974550 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974552 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974555 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:32:51.981727 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974557 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974560 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974563 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974566 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974569 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974572 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974574 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.974577 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:32:51.982446 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.975338 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.982523 2535 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.982540 2535 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982591 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982596 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982600 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982603 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982606 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982609 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982612 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982614 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982617 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982620 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982622 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982625 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982627 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982630 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982632 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982635 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982638 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:32:51.982716 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982641 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982645 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982648 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982652 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982654 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982657 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982661 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982663 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982666 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982669 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982672 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982675 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982677 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982680 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982683 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982686 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982688 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982691 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982694 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:32:51.983211 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982696 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982699 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982702 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982704 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982707 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982711 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982716 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982719 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982722 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982725 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982728 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982731 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982733 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982736 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982739 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982741 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982744 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982747 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982749 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:32:51.983706 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982752 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982755 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982757 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982760 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982762 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982765 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982768 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982770 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982773 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982775 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982778 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982781 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982783 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982786 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982788 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982791 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982793 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982796 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982798 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:32:51.984210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982801 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982804 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982807 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982810 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982812 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982815 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982818 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982820 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982822 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982825 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982828 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982830 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.982835 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982937 2535 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982942 2535 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982945 2535 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 08:32:51.984700 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982948 2535 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982951 2535 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982953 2535 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982956 2535 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982959 2535 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982962 2535 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982964 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982967 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982969 2535 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982972 2535 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982974 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982977 2535 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982979 2535 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982982 2535 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982984 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982987 2535 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982989 2535 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982992 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982994 2535 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.982998 2535 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 08:32:51.985101 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983001 2535 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983003 2535 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983005 2535 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983008 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983010 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983013 2535 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983015 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983018 2535 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983020 2535 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983023 2535 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983026 2535 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983028 2535 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983031 2535 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983033 2535 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983036 2535 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983038 2535 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983041 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983044 2535 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983046 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983049 2535 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 08:32:51.985615 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983052 2535 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983057 2535 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983059 2535 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983062 2535 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983065 2535 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983067 2535 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983070 2535 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983073 2535 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983075 2535 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983078 2535 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983080 2535 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983083 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983087 2535 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983091 2535 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983094 2535 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983097 2535 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983100 2535 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983103 2535 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983105 2535 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 08:32:51.986119 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983108 2535 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983110 2535 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983113 2535 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983116 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983118 2535 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983121 2535 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983123 2535 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983126 2535 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983129 2535 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983131 2535 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983133 2535 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983136 2535 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983138 2535 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983141 2535 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983143 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983146 2535 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983148 2535 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983151 2535 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983153 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983156 2535 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 08:32:51.986643 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983158 2535 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983160 2535 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983163 2535 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:51.983165 2535 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.983170 2535 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.983928 2535 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 08:32:51.987120 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.986051 2535 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 08:32:51.987285 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.987191 2535 server.go:1019] "Starting client certificate rotation"
Apr 16 08:32:51.987327 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.987306 2535 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 08:32:51.987376 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:51.987364 2535 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 08:32:52.016096 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.016070 2535 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 08:32:52.018583 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.018560 2535 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 08:32:52.054599 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.054573 2535 log.go:25] "Validated CRI v1 runtime API"
Apr 16 08:32:52.061521 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.061502 2535 log.go:25] "Validated CRI v1 image API"
Apr 16 08:32:52.062813 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.062801 2535 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 08:32:52.065206 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.065185 2535 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c627f49a-480f-4dc3-9fae-40e8b419051f:/dev/nvme0n1p4 c731922e-5bbc-4707-bbeb-c54f9d318bb6:/dev/nvme0n1p3]
Apr 16 08:32:52.065277 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.065205 2535 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 08:32:52.072406 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.072297 2535 manager.go:217] Machine: {Timestamp:2026-04-16 08:32:52.070077899 +0000 UTC m=+0.460932121 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099655 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec20bd64bd84e960bffa9099dea2be37 SystemUUID:ec20bd64-bd84-e960-bffa-9099dea2be37 BootID:8db7950e-19d9-4852-8167-ed8b284fdd40 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:40:65:e6:8d:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:40:65:e6:8d:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1e:c4:89:1e:a7:94 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 08:32:52.072406 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.072394 2535 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 08:32:52.072540 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.072473 2535 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 08:32:52.072540 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.072480 2535 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 08:32:52.075837 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.075813 2535 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 08:32:52.075977 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.075846 2535 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 08:32:52.076030 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.075987 2535 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 08:32:52.076030 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.075995 2535 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 08:32:52.076030 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.076008
2535 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:32:52.076955 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.076944 2535 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 08:32:52.078518 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.078507 2535 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:32:52.078639 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.078630 2535 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 08:32:52.081464 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.081171 2535 kubelet.go:491] "Attempting to sync node with API server" Apr 16 08:32:52.081464 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.081182 2535 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 08:32:52.081464 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.081194 2535 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 08:32:52.081464 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.081203 2535 kubelet.go:397] "Adding apiserver pod source" Apr 16 08:32:52.081464 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.081212 2535 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 08:32:52.082428 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.082414 2535 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:32:52.082504 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.082439 2535 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 08:32:52.085934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.085916 2535 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 08:32:52.087785 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:32:52.087772 2535 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 08:32:52.089158 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089146 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089162 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089168 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089173 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089179 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089184 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089190 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089195 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089202 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089208 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089216 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 
08:32:52.089219 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.089224 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 08:32:52.090199 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.090188 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 08:32:52.090199 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.090199 2535 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 08:32:52.092205 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.092171 2535 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 08:32:52.092275 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.092201 2535 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 08:32:52.094271 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.094257 2535 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 08:32:52.094332 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.094293 2535 server.go:1295] "Started kubelet" Apr 16 08:32:52.094408 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.094383 2535 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 08:32:52.094478 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.094438 2535 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 08:32:52.094551 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.094526 2535 server_v1.go:47] "podresources" method="list" 
useActivePods=true Apr 16 08:32:52.095042 ip-10-0-139-144 systemd[1]: Started Kubernetes Kubelet. Apr 16 08:32:52.095756 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.095611 2535 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 08:32:52.097666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.097651 2535 server.go:317] "Adding debug handlers to kubelet server" Apr 16 08:32:52.101411 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.101390 2535 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 08:32:52.102000 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.101983 2535 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 08:32:52.103311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103034 2535 factory.go:55] Registering systemd factory Apr 16 08:32:52.103311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103053 2535 factory.go:223] Registration of the systemd container factory successfully Apr 16 08:32:52.103311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103036 2535 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 08:32:52.103311 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.103188 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found" Apr 16 08:32:52.103558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103318 2535 factory.go:153] Registering CRI-O factory Apr 16 08:32:52.103558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103331 2535 factory.go:223] Registration of the crio container factory successfully Apr 16 08:32:52.103558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103403 2535 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 16 08:32:52.103558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103428 2535 factory.go:103] Registering Raw factory Apr 16 08:32:52.103558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103442 2535 manager.go:1196] Started watching for new ooms in manager Apr 16 08:32:52.103783 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103643 2535 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 08:32:52.103783 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103665 2535 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 08:32:52.103783 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103779 2535 reconstruct.go:97] "Volume reconstruction finished" Apr 16 08:32:52.103783 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.103785 2535 reconciler.go:26] "Reconciler: start to sync state" Apr 16 08:32:52.104117 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.104099 2535 manager.go:319] Starting recovery of all containers Apr 16 08:32:52.104682 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.104657 2535 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 08:32:52.104990 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.104964 2535 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-139-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 08:32:52.108795 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.108773 2535 csi_plugin.go:988] Failed to contact API server when waiting for 
CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 08:32:52.108888 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.104819 2535 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-139-144.ec2.internal.18a6c93e3720776b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-144.ec2.internal,UID:ip-10-0-139-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-139-144.ec2.internal,},FirstTimestamp:2026-04-16 08:32:52.094269291 +0000 UTC m=+0.485123513,LastTimestamp:2026-04-16 08:32:52.094269291 +0000 UTC m=+0.485123513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-144.ec2.internal,}" Apr 16 08:32:52.111697 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.111660 2535 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 08:32:52.119745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.119642 2535 manager.go:324] Recovery completed Apr 16 08:32:52.122050 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.122034 2535 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7r5s8" Apr 16 08:32:52.123703 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.123692 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.126101 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126085 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.126172 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126116 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.126172 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126128 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.126657 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126641 2535 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 08:32:52.126657 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126656 2535 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 08:32:52.126770 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.126674 2535 state_mem.go:36] "Initialized new in-memory state store" Apr 16 08:32:52.127889 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.127803 2535 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{ip-10-0-139-144.ec2.internal.18a6c93e390626a7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-139-144.ec2.internal,UID:ip-10-0-139-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-139-144.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-139-144.ec2.internal,},FirstTimestamp:2026-04-16 08:32:52.126099111 +0000 UTC m=+0.516953333,LastTimestamp:2026-04-16 08:32:52.126099111 +0000 UTC m=+0.516953333,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-139-144.ec2.internal,}" Apr 16 08:32:52.129251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.129236 2535 policy_none.go:49] "None policy: Start" Apr 16 08:32:52.129328 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.129256 2535 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 08:32:52.129328 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.129269 2535 state_mem.go:35] "Initializing new in-memory state store" Apr 16 08:32:52.129419 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.129397 2535 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-7r5s8" Apr 16 08:32:52.179071 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179057 2535 manager.go:341] "Starting Device Plugin manager" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.179086 2535 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179096 2535 server.go:85] "Starting device plugin registration server" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:32:52.179323 2535 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179335 2535 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179432 2535 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179519 2535 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.179528 2535 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.179990 2535 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 08:32:52.184913 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.180025 2535 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-144.ec2.internal\" not found" Apr 16 08:32:52.251983 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.251921 2535 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 08:32:52.253053 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.253037 2535 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 08:32:52.253133 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.253063 2535 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 08:32:52.253133 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.253081 2535 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 08:32:52.253133 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.253088 2535 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 08:32:52.253133 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.253120 2535 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 08:32:52.255688 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.255669 2535 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:32:52.279699 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.279683 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.280957 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.280942 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.281011 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.280971 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.281011 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.280984 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.281011 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.281009 2535 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.290888 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.290871 2535 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.290939 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.290890 2535 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-144.ec2.internal\": node \"ip-10-0-139-144.ec2.internal\" not found" Apr 16 
08:32:52.353707 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.353684 2535 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"] Apr 16 08:32:52.353796 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.353757 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.354587 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.354571 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.354658 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.354598 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.354658 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.354608 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.355871 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.355859 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.356030 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356015 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.356107 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356047 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.356592 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356579 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.356644 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356607 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.356644 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356618 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.356702 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356579 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.356702 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356676 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.356702 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.356690 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.358324 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.358307 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.358404 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.358340 2535 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 08:32:52.359059 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.359045 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientMemory" Apr 16 08:32:52.359131 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.359069 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 08:32:52.359131 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.359085 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeHasSufficientPID" Apr 16 08:32:52.365079 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.365065 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found" Apr 16 08:32:52.386057 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.386039 2535 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-144.ec2.internal\" not found" node="ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.390354 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.390329 2535 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-144.ec2.internal\" not found" node="ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.405666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.405645 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/118bda75cde6d644622189268cb66453-config\") pod 
\"kube-apiserver-proxy-ip-10-0-139-144.ec2.internal\" (UID: \"118bda75cde6d644622189268cb66453\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.405749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.405675 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.405749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.405704 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.465480 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.465452 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found" Apr 16 08:32:52.506653 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506586 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/118bda75cde6d644622189268cb66453-config\") pod \"kube-apiserver-proxy-ip-10-0-139-144.ec2.internal\" (UID: \"118bda75cde6d644622189268cb66453\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" Apr 16 08:32:52.506653 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506619 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.506653 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506644 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.506817 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506656 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/118bda75cde6d644622189268cb66453-config\") pod \"kube-apiserver-proxy-ip-10-0-139-144.ec2.internal\" (UID: \"118bda75cde6d644622189268cb66453\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.506817 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506677 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.506817 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.506709 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/73b2365eabd261842813e22728dbf0cc-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal\" (UID: \"73b2365eabd261842813e22728dbf0cc\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.565848 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.565823 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found"
Apr 16 08:32:52.666446 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.666408 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found"
Apr 16 08:32:52.689611 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.689593 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.693195 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.693179 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:52.767468 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.767373 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found"
Apr 16 08:32:52.867929 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:52.867895 2535 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-144.ec2.internal\" not found"
Apr 16 08:32:52.914076 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.914057 2535 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 08:32:52.987903 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.987879 2535 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 08:32:52.988319 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.987991 2535 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 08:32:52.988319 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:52.988032 2535 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 08:32:53.003169 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.003147 2535 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:53.023159 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.023103 2535 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 08:32:53.024865 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.024851 2535 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal"
Apr 16 08:32:53.037567 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.037554 2535 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 08:32:53.081543 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.081527 2535 apiserver.go:52] "Watching apiserver"
Apr 16 08:32:53.088231 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.088201 2535 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 08:32:53.088543 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.088521 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb","openshift-cluster-node-tuning-operator/tuned-gxcvx","openshift-dns/node-resolver-clvxz","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal","openshift-multus/multus-t9jnz","openshift-ovn-kubernetes/ovnkube-node-2b6j8","kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal","openshift-image-registry/node-ca-c4ztq","openshift-multus/multus-additional-cni-plugins-g4m4l","openshift-multus/network-metrics-daemon-88lvl","openshift-network-diagnostics/network-check-target-bfsfg","openshift-network-operator/iptables-alerter-p2z7c","kube-system/konnectivity-agent-v8qrs"]
Apr 16 08:32:53.091500 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.091470 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p2z7c"
Apr 16 08:32:53.092528 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.092507 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.092617 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.092568 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-clvxz"
Apr 16 08:32:53.093624 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.093605 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.094044 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.094027 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.094108 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.094092 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.094326 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.094312 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jk5tm\""
Apr 16 08:32:53.094404 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.094386 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 08:32:53.094741 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.094726 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.095231 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095216 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.095311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095219 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-d6lsm\""
Apr 16 08:32:53.095311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095222 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.095473 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095461 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.095556 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095530 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 08:32:53.095556 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095551 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.095653 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095596 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-j754g\""
Apr 16 08:32:53.095829 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095799 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.095829 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095829 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.096014 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.095975 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-j6zpl\""
Apr 16 08:32:53.096014 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.096010 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.096915 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.096873 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qtgdf\""
Apr 16 08:32:53.096915 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.096891 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 08:32:53.096915 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.096909 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.097120 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.096951 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 08:32:53.097186 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.097135 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.097240 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.097189 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:32:53.097930 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.097915 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.098077 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098051 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 08:32:53.098077 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098066 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sccxv\""
Apr 16 08:32:53.098311 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098297 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 08:32:53.098374 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098315 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 08:32:53.098432 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098304 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 08:32:53.098432 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098417 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.098679 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.098562 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.099341 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.099328 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-q4ckz\""
Apr 16 08:32:53.099378 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.099328 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 08:32:53.099628 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.099590 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.099675 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.099629 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 08:32:53.100365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.100351 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-dffzl\""
Apr 16 08:32:53.100529 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.100517 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 08:32:53.100574 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.100531 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 08:32:53.100574 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.100570 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 08:32:53.100867 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.100854 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:32:53.100941 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.100925 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:32:53.101538 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.101523 2535 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 08:32:53.101599 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.101586 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 08:32:53.101694 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.101677 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-6bddg\""
Apr 16 08:32:53.101784 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.101769 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 08:32:53.101958 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.101941 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:32:53.102025 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.101987 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:32:53.104021 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.104005 2535 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 08:32:53.109902 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109877 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-device-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.109902 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109900 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-sys-fs\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.110019 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109915 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-socket-dir-parent\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.110019 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109930 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-ovn\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.110019 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109943 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7efe070a-8151-4485-9a0a-23daecd1d21c-serviceca\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.110019 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.109975 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-cnibin\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.110019 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110014 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110061 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-bin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110096 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-env-overrides\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110122 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ed7597d-e9d4-47fd-acaf-b04d3b412318-host-slash\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110145 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-slash\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110179 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4nq6\" (UniqueName: \"kubernetes.io/projected/72c8255d-1385-43a9-b20b-b7dfd0472d12-kube-api-access-r4nq6\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110206 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-config\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110229 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-kubernetes\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.110257 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110252 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-lib-modules\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110279 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-registration-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110359 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110386 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-daemon-config\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110411 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-agent-certs\") pod \"konnectivity-agent-v8qrs\" (UID: \"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110434 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72c8255d-1385-43a9-b20b-b7dfd0472d12-hosts-file\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110466 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110504 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-host\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110535 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110558 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7pc7\" (UniqueName: \"kubernetes.io/projected/6ed7597d-e9d4-47fd-acaf-b04d3b412318-kube-api-access-p7pc7\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110580 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-systemd\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110602 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-netns\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110632 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-multus\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110656 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-etc-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.110687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110689 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110710 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110739 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-conf-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110761 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-kubelet\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110784 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-systemd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110804 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-os-release\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110836 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-system-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110860 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-hostroot\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110884 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-etc-kubernetes\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110907 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-var-lib-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110933 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd6kv\" (UniqueName: \"kubernetes.io/projected/542120ec-6600-4198-bcc2-69755cfcf1d4-kube-api-access-kd6kv\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.110957 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-konnectivity-ca\") pod \"konnectivity-agent-v8qrs\" (UID: \"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111240 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2zq2\" (UniqueName: \"kubernetes.io/projected/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-kube-api-access-t2zq2\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:32:53.111313 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111303 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-var-lib-kubelet\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.112054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111334 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.112054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111368 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-bin\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.112054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111402 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.112054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111432 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-conf\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.112054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.111461 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-tuned\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112066 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tdp\" (UniqueName: \"kubernetes.io/projected/bb5bb30d-eff2-430a-b4d1-c40e534c027f-kube-api-access-g2tdp\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112111 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-script-lib\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112140 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName:
\"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112174 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-modprobe-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112198 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-tmp\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.112247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112229 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-socket-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112257 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112286 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-systemd-units\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112314 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-multus-certs\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112335 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-node-log\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112362 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpfqq\" (UniqueName: \"kubernetes.io/projected/7efe070a-8151-4485-9a0a-23daecd1d21c-kube-api-access-lpfqq\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112391 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-system-cni-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " 
pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112418 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klmkx\" (UniqueName: \"kubernetes.io/projected/c79b1bf6-7559-4369-a195-27e8876dde6f-kube-api-access-klmkx\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.112459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112446 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72c8255d-1385-43a9-b20b-b7dfd0472d12-tmp-dir\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112467 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-os-release\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112517 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cni-binary-copy\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112572 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-k8s-cni-cncf-io\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112682 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-kubelet\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112717 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-netns\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112747 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ed7597d-e9d4-47fd-acaf-b04d3b412318-iptables-alerter-script\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112790 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62gvc\" (UniqueName: \"kubernetes.io/projected/940b59dd-2162-4388-aea6-b4ba3b7aab77-kube-api-access-62gvc\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.112877 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:32:53.112815 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cnibin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112845 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.112877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.112874 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542120ec-6600-4198-bcc2-69755cfcf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113055 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7efe070a-8151-4485-9a0a-23daecd1d21c-host\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113084 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgj7g\" (UniqueName: \"kubernetes.io/projected/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-kube-api-access-vgj7g\") pod \"tuned-gxcvx\" (UID: 
\"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113114 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-log-socket\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113137 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113171 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113206 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysconfig\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113247 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-run\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113294 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-sys\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.113365 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.113325 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-netd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.115155 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.115133 2535 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 08:32:53.131423 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.131396 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 08:27:52 +0000 UTC" deadline="2027-10-29 17:50:59.746400934 +0000 UTC" Apr 16 08:32:53.131423 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.131418 2535 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13473h18m6.61498473s" Apr 16 08:32:53.132280 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.132265 2535 
csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mpgz6" Apr 16 08:32:53.135859 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.135840 2535 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:32:53.141264 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.141240 2535 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mpgz6" Apr 16 08:32:53.213897 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.213877 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-kubernetes\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.213903 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-lib-modules\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.213922 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-registration-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.213944 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.213983 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-kubernetes\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214003 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-registration-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214021 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-daemon-config\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214033 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-etc-selinux\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214040 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214038 2535 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-lib-modules\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214047 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-agent-certs\") pod \"konnectivity-agent-v8qrs\" (UID: \"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214070 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72c8255d-1385-43a9-b20b-b7dfd0472d12-hosts-file\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214092 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214114 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-host\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214116 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/72c8255d-1385-43a9-b20b-b7dfd0472d12-hosts-file\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214144 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-host\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214138 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214174 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-kubelet-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214190 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p7pc7\" (UniqueName: \"kubernetes.io/projected/6ed7597d-e9d4-47fd-acaf-b04d3b412318-kube-api-access-p7pc7\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.214196 2535 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214215 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-systemd\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214247 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-netns\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214280 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-netns\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.214283 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:32:53.714252294 +0000 UTC m=+2.105106535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214298 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-systemd\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214325 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-multus\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.214368 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214358 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-etc-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214384 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:32:53.214395 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-multus\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214406 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214437 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-etc-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214469 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-conf-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214507 2535 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214547 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-kubelet\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214547 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214586 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-conf-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214513 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-kubelet\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214602 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-daemon-config\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214617 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-systemd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214642 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-os-release\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214667 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-system-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214691 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-hostroot\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214694 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-systemd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214721 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-etc-kubernetes\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.215214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214734 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-hostroot\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214737 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-system-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214722 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-os-release\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214749 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-var-lib-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214783 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd6kv\" (UniqueName: \"kubernetes.io/projected/542120ec-6600-4198-bcc2-69755cfcf1d4-kube-api-access-kd6kv\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214757 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-etc-kubernetes\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214807 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-konnectivity-ca\") pod \"konnectivity-agent-v8qrs\" (UID: \"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214784 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-var-lib-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214834 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2zq2\" (UniqueName: \"kubernetes.io/projected/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-kube-api-access-t2zq2\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214877 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-var-lib-kubelet\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214900 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214919 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-bin\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214939 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214958 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-conf\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214974 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214978 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-tuned\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.214998 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-bin\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216006 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215016 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tdp\" (UniqueName: \"kubernetes.io/projected/bb5bb30d-eff2-430a-b4d1-c40e534c027f-kube-api-access-g2tdp\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215030 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-var-lib-kubelet\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215035 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215038 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-script-lib\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215083 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215110 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-modprobe-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215120 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysctl-conf\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215134 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-tmp\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215178 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-openvswitch\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215260 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-modprobe-d\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215295 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-socket-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215320 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215343 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-systemd-units\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215368 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-multus-certs\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215389 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-konnectivity-ca\") pod \"konnectivity-agent-v8qrs\" (UID: \"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215392 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-node-log\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215421 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-node-log\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.216854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215433 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lpfqq\" (UniqueName: \"kubernetes.io/projected/7efe070a-8151-4485-9a0a-23daecd1d21c-kube-api-access-lpfqq\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215459 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-system-cni-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215505 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-klmkx\" (UniqueName: \"kubernetes.io/projected/c79b1bf6-7559-4369-a195-27e8876dde6f-kube-api-access-klmkx\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215528 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72c8255d-1385-43a9-b20b-b7dfd0472d12-tmp-dir\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215550 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-os-release\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215573 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cni-binary-copy\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215593 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-cni-binary-copy\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215601 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-socket-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215621 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-script-lib\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215628 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-k8s-cni-cncf-io\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215598 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-k8s-cni-cncf-io\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215663 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-systemd-units\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215665 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-kubelet\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215695 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-kubelet\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215697 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-netns\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215726 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-run-netns\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215754 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ed7597d-e9d4-47fd-acaf-b04d3b412318-iptables-alerter-script\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c"
Apr 16 08:32:53.217655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215773 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-run-multus-certs\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215789 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62gvc\" (UniqueName: \"kubernetes.io/projected/940b59dd-2162-4388-aea6-b4ba3b7aab77-kube-api-access-62gvc\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215821 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cnibin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215854 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215915 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542120ec-6600-4198-bcc2-69755cfcf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215944 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7efe070a-8151-4485-9a0a-23daecd1d21c-host\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215992 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vgj7g\" (UniqueName: \"kubernetes.io/projected/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-kube-api-access-vgj7g\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216026 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-log-socket\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216058 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216087 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216121 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysconfig\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216151 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-run\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.215654 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-cni-dir\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216180 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-sys\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216205 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-netd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216230 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-device-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216251 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cni-binary-copy\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.218293 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216257 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-sys-fs\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216295 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-sys-fs\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216304 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-socket-dir-parent\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216334 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-ovn\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216359 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7efe070a-8151-4485-9a0a-23daecd1d21c-serviceca\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216384 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-cnibin\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216388 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216411 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216417 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-log-socket\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216671 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-os-release\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216720 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-multus-socket-dir-parent\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216743 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-system-cni-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216756 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-run-ovn\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216798 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-sysconfig\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216841 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-run\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216878 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-cnibin\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216359 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-cnibin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216206 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7efe070a-8151-4485-9a0a-23daecd1d21c-host\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq"
Apr 16 08:32:53.219082 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216900 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72c8255d-1385-43a9-b20b-b7dfd0472d12-tmp-dir\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216938 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-cni-netd\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216983 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/940b59dd-2162-4388-aea6-b4ba3b7aab77-device-dir\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.216989 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-sys\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217106 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-bin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217162 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-env-overrides\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217187 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ed7597d-e9d4-47fd-acaf-b04d3b412318-host-slash\") pod \"iptables-alerter-p2z7c\" (UID: 
\"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217234 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-slash\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217251 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7efe070a-8151-4485-9a0a-23daecd1d21c-serviceca\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217258 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4nq6\" (UniqueName: \"kubernetes.io/projected/72c8255d-1385-43a9-b20b-b7dfd0472d12-kube-api-access-r4nq6\") pod \"node-resolver-clvxz\" (UID: \"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217302 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-config\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217453 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6ed7597d-e9d4-47fd-acaf-b04d3b412318-iptables-alerter-script\") pod \"iptables-alerter-p2z7c\" 
(UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217184 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c79b1bf6-7559-4369-a195-27e8876dde6f-tuning-conf-dir\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217482 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6ed7597d-e9d4-47fd-acaf-b04d3b412318-host-slash\") pod \"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217534 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bb5bb30d-eff2-430a-b4d1-c40e534c027f-host-var-lib-cni-bin\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217546 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/542120ec-6600-4198-bcc2-69755cfcf1d4-host-slash\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217729 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-env-overrides\") pod \"ovnkube-node-2b6j8\" (UID: 
\"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.219716 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217747 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c79b1bf6-7559-4369-a195-27e8876dde6f-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.220162 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217819 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/542120ec-6600-4198-bcc2-69755cfcf1d4-ovnkube-config\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.220162 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.217989 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-etc-tuned\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.220162 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.218043 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-tmp\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.220162 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.218245 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f-agent-certs\") pod \"konnectivity-agent-v8qrs\" (UID: 
\"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f\") " pod="kube-system/konnectivity-agent-v8qrs" Apr 16 08:32:53.220162 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.219153 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/542120ec-6600-4198-bcc2-69755cfcf1d4-ovn-node-metrics-cert\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.226383 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.224068 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:32:53.226383 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.224100 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:32:53.226383 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.224114 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:53.226383 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.224185 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. No retries permitted until 2026-04-16 08:32:53.724165833 +0000 UTC m=+2.115020063 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:53.226383 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.226246 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpfqq\" (UniqueName: \"kubernetes.io/projected/7efe070a-8151-4485-9a0a-23daecd1d21c-kube-api-access-lpfqq\") pod \"node-ca-c4ztq\" (UID: \"7efe070a-8151-4485-9a0a-23daecd1d21c\") " pod="openshift-image-registry/node-ca-c4ztq" Apr 16 08:32:53.228288 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.228231 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-klmkx\" (UniqueName: \"kubernetes.io/projected/c79b1bf6-7559-4369-a195-27e8876dde6f-kube-api-access-klmkx\") pod \"multus-additional-cni-plugins-g4m4l\" (UID: \"c79b1bf6-7559-4369-a195-27e8876dde6f\") " pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.228384 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.228316 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62gvc\" (UniqueName: \"kubernetes.io/projected/940b59dd-2162-4388-aea6-b4ba3b7aab77-kube-api-access-62gvc\") pod \"aws-ebs-csi-driver-node-jtkwb\" (UID: \"940b59dd-2162-4388-aea6-b4ba3b7aab77\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.228543 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.228516 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4nq6\" (UniqueName: \"kubernetes.io/projected/72c8255d-1385-43a9-b20b-b7dfd0472d12-kube-api-access-r4nq6\") pod \"node-resolver-clvxz\" (UID: 
\"72c8255d-1385-43a9-b20b-b7dfd0472d12\") " pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.228655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.228633 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tdp\" (UniqueName: \"kubernetes.io/projected/bb5bb30d-eff2-430a-b4d1-c40e534c027f-kube-api-access-g2tdp\") pod \"multus-t9jnz\" (UID: \"bb5bb30d-eff2-430a-b4d1-c40e534c027f\") " pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.228894 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.228874 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2zq2\" (UniqueName: \"kubernetes.io/projected/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-kube-api-access-t2zq2\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:53.229177 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.229161 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgj7g\" (UniqueName: \"kubernetes.io/projected/cd8eb931-b262-4a3d-a87a-1d92a4f2ac91-kube-api-access-vgj7g\") pod \"tuned-gxcvx\" (UID: \"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91\") " pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.229430 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.229413 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd6kv\" (UniqueName: \"kubernetes.io/projected/542120ec-6600-4198-bcc2-69755cfcf1d4-kube-api-access-kd6kv\") pod \"ovnkube-node-2b6j8\" (UID: \"542120ec-6600-4198-bcc2-69755cfcf1d4\") " pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.229973 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.229955 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7pc7\" (UniqueName: \"kubernetes.io/projected/6ed7597d-e9d4-47fd-acaf-b04d3b412318-kube-api-access-p7pc7\") pod 
\"iptables-alerter-p2z7c\" (UID: \"6ed7597d-e9d4-47fd-acaf-b04d3b412318\") " pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.238742 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.238726 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-v8qrs" Apr 16 08:32:53.253161 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.253145 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-c4ztq" Apr 16 08:32:53.260144 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.260131 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" Apr 16 08:32:53.267067 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.267042 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod118bda75cde6d644622189268cb66453.slice/crio-7a2bc28b932cd30fc285d864781c3243a062d18ed59b29ca9c367b048f232947 WatchSource:0}: Error finding container 7a2bc28b932cd30fc285d864781c3243a062d18ed59b29ca9c367b048f232947: Status 404 returned error can't find the container with id 7a2bc28b932cd30fc285d864781c3243a062d18ed59b29ca9c367b048f232947 Apr 16 08:32:53.267554 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.267535 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73b2365eabd261842813e22728dbf0cc.slice/crio-6810e9c53dae26888ed26d6877cde0fa0f97527f33d696da4f6634511980c9c5 WatchSource:0}: Error finding container 6810e9c53dae26888ed26d6877cde0fa0f97527f33d696da4f6634511980c9c5: Status 404 returned error can't find the container with id 6810e9c53dae26888ed26d6877cde0fa0f97527f33d696da4f6634511980c9c5 Apr 16 08:32:53.268945 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.268911 2535 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7efe070a_8151_4485_9a0a_23daecd1d21c.slice/crio-ff3653bed783661f930e6d4b549961a1fdba416b1b423a0d153277fc6f3f704e WatchSource:0}: Error finding container ff3653bed783661f930e6d4b549961a1fdba416b1b423a0d153277fc6f3f704e: Status 404 returned error can't find the container with id ff3653bed783661f930e6d4b549961a1fdba416b1b423a0d153277fc6f3f704e Apr 16 08:32:53.269622 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.269600 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79b1bf6_7559_4369_a195_27e8876dde6f.slice/crio-5d39213190c9812639f1523043cdef581a9001d57d07d140fb851b8b2068fe37 WatchSource:0}: Error finding container 5d39213190c9812639f1523043cdef581a9001d57d07d140fb851b8b2068fe37: Status 404 returned error can't find the container with id 5d39213190c9812639f1523043cdef581a9001d57d07d140fb851b8b2068fe37 Apr 16 08:32:53.273632 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.273619 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:32:53.419838 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.419812 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-p2z7c" Apr 16 08:32:53.427015 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.426992 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ed7597d_e9d4_47fd_acaf_b04d3b412318.slice/crio-4dc44c9cb87335394823043909a6c69d688043ecadfbc6e30cb6aefb47499203 WatchSource:0}: Error finding container 4dc44c9cb87335394823043909a6c69d688043ecadfbc6e30cb6aefb47499203: Status 404 returned error can't find the container with id 4dc44c9cb87335394823043909a6c69d688043ecadfbc6e30cb6aefb47499203 Apr 16 08:32:53.438875 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.438843 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" Apr 16 08:32:53.445234 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.445214 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd8eb931_b262_4a3d_a87a_1d92a4f2ac91.slice/crio-cc686e2b41cf9f2d4f0485784a42a632cd5206ee6e9bca1385c9e3cc561221af WatchSource:0}: Error finding container cc686e2b41cf9f2d4f0485784a42a632cd5206ee6e9bca1385c9e3cc561221af: Status 404 returned error can't find the container with id cc686e2b41cf9f2d4f0485784a42a632cd5206ee6e9bca1385c9e3cc561221af Apr 16 08:32:53.460763 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.460744 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-clvxz" Apr 16 08:32:53.466087 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.466067 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c8255d_1385_43a9_b20b_b7dfd0472d12.slice/crio-9de07583cbec43b67db9f42fdf494f9e8f3a4d87dd22f045b463805cb158b688 WatchSource:0}: Error finding container 9de07583cbec43b67db9f42fdf494f9e8f3a4d87dd22f045b463805cb158b688: Status 404 returned error can't find the container with id 9de07583cbec43b67db9f42fdf494f9e8f3a4d87dd22f045b463805cb158b688 Apr 16 08:32:53.477558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.477540 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" Apr 16 08:32:53.482923 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.482903 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod940b59dd_2162_4388_aea6_b4ba3b7aab77.slice/crio-837c420283d2b72fac83d0b2e816b7621852b570d2b4009b5a625db04490a5f8 WatchSource:0}: Error finding container 837c420283d2b72fac83d0b2e816b7621852b570d2b4009b5a625db04490a5f8: Status 404 returned error can't find the container with id 837c420283d2b72fac83d0b2e816b7621852b570d2b4009b5a625db04490a5f8 Apr 16 08:32:53.494557 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.494540 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-t9jnz" Apr 16 08:32:53.499763 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.499746 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb5bb30d_eff2_430a_b4d1_c40e534c027f.slice/crio-e88719569a9b30f61957ca7500c88269d80bb7b91ff3ce8d64c308a8fe860136 WatchSource:0}: Error finding container e88719569a9b30f61957ca7500c88269d80bb7b91ff3ce8d64c308a8fe860136: Status 404 returned error can't find the container with id e88719569a9b30f61957ca7500c88269d80bb7b91ff3ce8d64c308a8fe860136 Apr 16 08:32:53.508620 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.508603 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:32:53.511507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.511474 2535 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:32:53.514308 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.514291 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod542120ec_6600_4198_bcc2_69755cfcf1d4.slice/crio-45a2104c2a55e0895ca130b12b25001496075119e716fa9c8cecb91c93587383 WatchSource:0}: Error finding container 45a2104c2a55e0895ca130b12b25001496075119e716fa9c8cecb91c93587383: Status 404 returned error can't find the container with id 45a2104c2a55e0895ca130b12b25001496075119e716fa9c8cecb91c93587383 Apr 16 08:32:53.536310 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:32:53.536288 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfaaf14a_4a7f_4c15_911d_d1a5a2f0c40f.slice/crio-72b0afb180facb935206d92f79c63a698f71bebccd69305b844eee5566a3d68c WatchSource:0}: Error finding container 
72b0afb180facb935206d92f79c63a698f71bebccd69305b844eee5566a3d68c: Status 404 returned error can't find the container with id 72b0afb180facb935206d92f79c63a698f71bebccd69305b844eee5566a3d68c Apr 16 08:32:53.720972 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.720914 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:53.721139 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.721061 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:53.721139 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.721121 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:32:54.721100409 +0000 UTC m=+3.111954666 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:53.822869 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.822181 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:53.822869 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.822363 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:32:53.822869 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.822383 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:32:53.822869 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.822397 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:53.822869 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:53.822454 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. 
No retries permitted until 2026-04-16 08:32:54.822436037 +0000 UTC m=+3.213290269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:53.906121 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:53.905905 2535 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:32:54.142809 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.142712 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:27:53 +0000 UTC" deadline="2028-01-28 23:08:59.799511286 +0000 UTC" Apr 16 08:32:54.142809 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.142760 2535 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15662h36m5.656756137s" Apr 16 08:32:54.274896 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.274681 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"45a2104c2a55e0895ca130b12b25001496075119e716fa9c8cecb91c93587383"} Apr 16 08:32:54.288946 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.288886 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9jnz" event={"ID":"bb5bb30d-eff2-430a-b4d1-c40e534c027f","Type":"ContainerStarted","Data":"e88719569a9b30f61957ca7500c88269d80bb7b91ff3ce8d64c308a8fe860136"} Apr 16 08:32:54.300665 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.299501 2535 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerStarted","Data":"5d39213190c9812639f1523043cdef581a9001d57d07d140fb851b8b2068fe37"} Apr 16 08:32:54.312072 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.312041 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c4ztq" event={"ID":"7efe070a-8151-4485-9a0a-23daecd1d21c","Type":"ContainerStarted","Data":"ff3653bed783661f930e6d4b549961a1fdba416b1b423a0d153277fc6f3f704e"} Apr 16 08:32:54.329217 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.329182 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" event={"ID":"73b2365eabd261842813e22728dbf0cc","Type":"ContainerStarted","Data":"6810e9c53dae26888ed26d6877cde0fa0f97527f33d696da4f6634511980c9c5"} Apr 16 08:32:54.338565 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.338534 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v8qrs" event={"ID":"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f","Type":"ContainerStarted","Data":"72b0afb180facb935206d92f79c63a698f71bebccd69305b844eee5566a3d68c"} Apr 16 08:32:54.369887 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.369846 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" event={"ID":"940b59dd-2162-4388-aea6-b4ba3b7aab77","Type":"ContainerStarted","Data":"837c420283d2b72fac83d0b2e816b7621852b570d2b4009b5a625db04490a5f8"} Apr 16 08:32:54.376641 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.376607 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-clvxz" event={"ID":"72c8255d-1385-43a9-b20b-b7dfd0472d12","Type":"ContainerStarted","Data":"9de07583cbec43b67db9f42fdf494f9e8f3a4d87dd22f045b463805cb158b688"} Apr 16 08:32:54.379059 
ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.379031 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" event={"ID":"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91","Type":"ContainerStarted","Data":"cc686e2b41cf9f2d4f0485784a42a632cd5206ee6e9bca1385c9e3cc561221af"} Apr 16 08:32:54.386507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.386470 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p2z7c" event={"ID":"6ed7597d-e9d4-47fd-acaf-b04d3b412318","Type":"ContainerStarted","Data":"4dc44c9cb87335394823043909a6c69d688043ecadfbc6e30cb6aefb47499203"} Apr 16 08:32:54.405677 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.405593 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" event={"ID":"118bda75cde6d644622189268cb66453","Type":"ContainerStarted","Data":"7a2bc28b932cd30fc285d864781c3243a062d18ed59b29ca9c367b048f232947"} Apr 16 08:32:54.730836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.730753 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:54.731003 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.730910 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:54.731003 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.730974 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:32:56.730956 +0000 UTC m=+5.121810210 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:54.832007 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:54.831953 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:54.832172 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.832127 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:32:54.832172 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.832147 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:32:54.832172 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.832159 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:54.832344 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:54.832227 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c 
podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. No retries permitted until 2026-04-16 08:32:56.832207332 +0000 UTC m=+5.223061560 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:55.113639 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:55.113238 2535 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 08:32:55.143133 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:55.143031 2535 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 08:27:53 +0000 UTC" deadline="2027-12-22 00:05:16.776871168 +0000 UTC" Apr 16 08:32:55.143133 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:55.143066 2535 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14751h32m21.633808767s" Apr 16 08:32:55.253536 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:55.253504 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:55.253536 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:55.253518 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:55.253773 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:55.253663 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:32:55.253880 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:55.253785 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:32:56.747171 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:56.747121 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:56.747662 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.747279 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:56.747662 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.747338 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:00.747320875 +0000 UTC m=+9.138175094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:32:56.847979 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:56.847943 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:56.848159 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.848101 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:32:56.848159 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.848128 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:32:56.848159 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.848140 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:56.848313 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:56.848205 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c 
podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:00.848185723 +0000 UTC m=+9.239039943 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:32:57.254072 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:57.254044 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:57.254247 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:57.254163 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:32:57.254582 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:57.254563 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:57.254687 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:57.254667 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:32:59.254328 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:59.253799 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:32:59.254328 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:32:59.253826 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:32:59.254328 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:59.253934 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:32:59.254328 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:32:59.254060 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:00.779949 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:00.779355 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:00.779949 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.779542 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:00.779949 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.779608 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:08.779589615 +0000 UTC m=+17.170443833 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:00.879748 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:00.879709 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:00.879930 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.879897 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:00.879930 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.879919 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:00.879930 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.879930 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:00.880130 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:00.879987 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:08.879974076 +0000 UTC m=+17.270828288 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:01.253971 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:01.253458 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:01.253971 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:01.253632 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:01.254292 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:01.253987 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:01.254292 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:01.254099 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:03.253753 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:03.253721 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:03.253753 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:03.253740 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:03.254222 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:03.253836 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:03.254222 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:03.253973 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:05.253672 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:05.253634 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:05.254159 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:05.253635 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:05.254159 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:05.253778 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:05.254159 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:05.253843 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:07.253854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.253816 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:07.253854 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.253838 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:07.254367 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:07.253941 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:07.254367 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:07.254061 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:07.456341 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.456314 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-bxg8l"] Apr 16 08:33:07.495145 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.495120 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.495316 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:07.495203 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066" Apr 16 08:33:07.627934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.627855 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.627934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.627906 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-dbus\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.627934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.627924 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-kubelet-config\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.728685 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.728651 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-dbus\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.728685 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.728691 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-kubelet-config\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.728903 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.728752 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.728903 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.728823 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-kubelet-config\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.728903 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:07.728861 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:07.728903 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:07.728861 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/ef850f58-6397-465e-9a95-6088bf0af066-dbus\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:07.729051 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:07.728919 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:08.228903013 +0000 UTC m=+16.619757221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:08.232432 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:08.232398 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:08.232739 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.232572 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:08.232739 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.232636 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:09.232620239 +0000 UTC m=+17.623474451 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:08.838428 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:08.838391 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:08.838904 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.838574 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:08.838904 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.838647 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:24.838625383 +0000 UTC m=+33.229479616 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 08:33:08.939169 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:08.939132 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:08.939365 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.939305 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 08:33:08.939365 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.939329 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 08:33:08.939365 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.939342 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 08:33:08.939522 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:08.939399 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:24.939380453 +0000 UTC m=+33.330234719 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:09.241839 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:09.241806 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:09.242028 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:09.241926 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:09.242028 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:09.241983 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:11.241966975 +0000 UTC m=+19.632821185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:09.253798 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:09.253766 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:09.253934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:09.253766 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:09.253934 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:09.253876 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:09.254054 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:09.253777 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:09.254054 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:09.253958 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:09.254146 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:09.254070 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:11.253552 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:11.253517 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:11.254004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:11.253560 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:11.254004 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:11.253659 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:11.254004 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:11.253694 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:11.254004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:11.253715 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:11.254004 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:11.253778 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:11.256731 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:11.256712 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:11.256830 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:11.256817 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:11.256876 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:11.256864 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:15.256848536 +0000 UTC m=+23.647702744 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:12.441650 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.441272 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-clvxz" event={"ID":"72c8255d-1385-43a9-b20b-b7dfd0472d12","Type":"ContainerStarted","Data":"817ff2222c304da5fa540baf7aaf0ec510ee11ab35ab974005366a804f709fc3"}
Apr 16 08:33:12.442957 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.442761 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" event={"ID":"cd8eb931-b262-4a3d-a87a-1d92a4f2ac91","Type":"ContainerStarted","Data":"b217863d59ffc00d179d022ca95d227dfc5440ae6dd75009e83b50f2c04a8969"}
Apr 16 08:33:12.443972 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.443951 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" event={"ID":"118bda75cde6d644622189268cb66453","Type":"ContainerStarted","Data":"2ec7842ea1456d0eec6e735785237496bb6530821fe29d9c572533876325bd2b"}
Apr 16 08:33:12.446233 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446215 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log"
Apr 16 08:33:12.446523 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446501 2535 generic.go:358] "Generic (PLEG): container finished" podID="542120ec-6600-4198-bcc2-69755cfcf1d4" containerID="2b0f7555d7431888c03232657f642d42a12d41ccd9881f0ddaf559f4f002e7fe" exitCode=1
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446526 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"7140068d653190eb236a8eef2fd1c1021ccebb15df9689aeb63977ea27ab8058"}
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446554 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"a53695ac25016249de6eeed85555ccf2ac37b9f5f3e58cf14373f7ef7d28975e"}
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446563 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"76846db68c91a16a4f367f6ff9d438edc38be8f7e25a0861af711eebc0e88166"}
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446590 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"7cf94a321e11c8b9617772256a42762816c1719c5c37df97629343e2415b1d94"}
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446601 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerDied","Data":"2b0f7555d7431888c03232657f642d42a12d41ccd9881f0ddaf559f4f002e7fe"}
Apr 16 08:33:12.446615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.446610 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"db188329f965bd7e0c957badb605e8229bd8643dfc7f8d1c21700b135448a4bd"}
Apr 16 08:33:12.447675 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.447659 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-t9jnz" event={"ID":"bb5bb30d-eff2-430a-b4d1-c40e534c027f","Type":"ContainerStarted","Data":"9297a2359ebd0cd58a0e3adbc40d09b8d2f8cc48989fb48b427446905db3e6e1"}
Apr 16 08:33:12.448878 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.448862 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="bdab156ce10ffc0d60a3f7dc07acd9191a529f3f5260a64b45e0830edc5b6161" exitCode=0
Apr 16 08:33:12.448969 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.448886 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"bdab156ce10ffc0d60a3f7dc07acd9191a529f3f5260a64b45e0830edc5b6161"}
Apr 16 08:33:12.450166 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.450149 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-c4ztq" event={"ID":"7efe070a-8151-4485-9a0a-23daecd1d21c","Type":"ContainerStarted","Data":"6b01c363524ee8db7659e7f9836cfd1438806e7340ff299eef9d3796e1ccc9d1"}
Apr 16 08:33:12.451355 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.451337 2535 generic.go:358] "Generic (PLEG): container finished" podID="73b2365eabd261842813e22728dbf0cc" containerID="8eb2081eb92b309f6c0edf1c6664660a8152aad9ec430378a44669b58e4167f4" exitCode=0
Apr 16 08:33:12.451431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.451402 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" event={"ID":"73b2365eabd261842813e22728dbf0cc","Type":"ContainerDied","Data":"8eb2081eb92b309f6c0edf1c6664660a8152aad9ec430378a44669b58e4167f4"}
Apr 16 08:33:12.452602 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.452583 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-v8qrs" event={"ID":"cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f","Type":"ContainerStarted","Data":"478107da7327163a67dd72e6115f274d95aaada86b6689a4a2f3a08031b2581f"}
Apr 16 08:33:12.453645 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.453629 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" event={"ID":"940b59dd-2162-4388-aea6-b4ba3b7aab77","Type":"ContainerStarted","Data":"54da6c7e5f860870ab53eef14c8309331bbfc601e329ec2caddaa41417634406"}
Apr 16 08:33:12.454757 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.454725 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-clvxz" podStartSLOduration=2.393602355 podStartE2EDuration="20.454715379s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.467501739 +0000 UTC m=+1.858355962" lastFinishedPulling="2026-04-16 08:33:11.528614758 +0000 UTC m=+19.919468986" observedRunningTime="2026-04-16 08:33:12.454624982 +0000 UTC m=+20.845479234" watchObservedRunningTime="2026-04-16 08:33:12.454715379 +0000 UTC m=+20.845569609"
Apr 16 08:33:12.470381 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.470344 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-144.ec2.internal" podStartSLOduration=19.470334466 podStartE2EDuration="19.470334466s" podCreationTimestamp="2026-04-16 08:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:12.469901628 +0000 UTC m=+20.860755860" watchObservedRunningTime="2026-04-16 08:33:12.470334466 +0000 UTC m=+20.861188674"
Apr 16 08:33:12.483820 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.483785 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gxcvx" podStartSLOduration=2.403171123 podStartE2EDuration="20.483775861s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.446685744 +0000 UTC m=+1.837539953" lastFinishedPulling="2026-04-16 08:33:11.527290474 +0000 UTC m=+19.918144691" observedRunningTime="2026-04-16 08:33:12.483705741 +0000 UTC m=+20.874559971" watchObservedRunningTime="2026-04-16 08:33:12.483775861 +0000 UTC m=+20.874630093"
Apr 16 08:33:12.501939 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.501904 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-t9jnz" podStartSLOduration=2.468701367 podStartE2EDuration="20.501894509s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.501049175 +0000 UTC m=+1.891903384" lastFinishedPulling="2026-04-16 08:33:11.534242316 +0000 UTC m=+19.925096526" observedRunningTime="2026-04-16 08:33:12.501629614 +0000 UTC m=+20.892483846" watchObservedRunningTime="2026-04-16 08:33:12.501894509 +0000 UTC m=+20.892748740"
Apr 16 08:33:12.517520 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.517466 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-v8qrs" podStartSLOduration=2.529341048 podStartE2EDuration="20.517456472s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.537639424 +0000 UTC m=+1.928493634" lastFinishedPulling="2026-04-16 08:33:11.525754849 +0000 UTC m=+19.916609058" observedRunningTime="2026-04-16 08:33:12.517285567 +0000 UTC m=+20.908139798" watchObservedRunningTime="2026-04-16 08:33:12.517456472 +0000 UTC m=+20.908310703"
Apr 16 08:33:12.540936 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:12.540845 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-c4ztq" podStartSLOduration=2.289001098 podStartE2EDuration="20.540835277s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.27394011 +0000 UTC m=+1.664794326" lastFinishedPulling="2026-04-16 08:33:11.525774281 +0000 UTC m=+19.916628505" observedRunningTime="2026-04-16 08:33:12.540529608 +0000 UTC m=+20.931383839" watchObservedRunningTime="2026-04-16 08:33:12.540835277 +0000 UTC m=+20.931689508"
Apr 16 08:33:13.150262 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.150227 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:33:13.150953 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.150931 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-v8qrs"
Apr 16 08:33:13.222997 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.222973 2535 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 08:33:13.254079 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.254046 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:13.254079 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.254080 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:13.254247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.254112 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:13.254247 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:13.254219 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:13.254428 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:13.254404 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:13.257509 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:13.254531 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:13.457373 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.457337 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" event={"ID":"940b59dd-2162-4388-aea6-b4ba3b7aab77","Type":"ContainerStarted","Data":"22f7e6928e54f4004437a5f1e120811256d39651a114c71014970df3004d8780"}
Apr 16 08:33:13.458750 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.458721 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p2z7c" event={"ID":"6ed7597d-e9d4-47fd-acaf-b04d3b412318","Type":"ContainerStarted","Data":"6002aaa1a1adf6a7253ddc62bc18d599375765b794956b5f64b420754f6df043"}
Apr 16 08:33:13.473719 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:13.473680 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p2z7c" podStartSLOduration=3.3773556510000002 podStartE2EDuration="21.473667351s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.429319984 +0000 UTC m=+1.820174193" lastFinishedPulling="2026-04-16 08:33:11.525631681 +0000 UTC m=+19.916485893" observedRunningTime="2026-04-16 08:33:13.473066421 +0000 UTC m=+21.863920653" watchObservedRunningTime="2026-04-16 08:33:13.473667351 +0000 UTC m=+21.864521582"
Apr 16 08:33:14.191603 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.191376 2535 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T08:33:13.222994119Z","UUID":"ccac7c56-e697-442a-aa1a-adb66f1736c9","Handler":null,"Name":"","Endpoint":""}
Apr 16 08:33:14.193065 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.193038 2535 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 08:33:14.193170 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.193080 2535 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 08:33:14.462739 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.462701 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" event={"ID":"940b59dd-2162-4388-aea6-b4ba3b7aab77","Type":"ContainerStarted","Data":"b74732979563877bd665a2ee0c328e7de507e3c0b31352449a95970157af6f9c"}
Apr 16 08:33:14.465424 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.465403 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log"
Apr 16 08:33:14.465772 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.465742 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"72d7451ba4fe7ec04cd876931f1915e6d06c66df823409e2ce6144bd4830da11"}
Apr 16 08:33:14.467419 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.467396 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" event={"ID":"73b2365eabd261842813e22728dbf0cc","Type":"ContainerStarted","Data":"52e36eea278311a9c578d5352af8a2a2f6c83a583c5f6004b58414d872c24a8b"}
Apr 16 08:33:14.467550 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.467448 2535 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 08:33:14.493117 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.493067 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-jtkwb" podStartSLOduration=1.924533098 podStartE2EDuration="22.493050621s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.484258812 +0000 UTC m=+1.875113025" lastFinishedPulling="2026-04-16 08:33:14.05277632 +0000 UTC m=+22.443630548" observedRunningTime="2026-04-16 08:33:14.492819096 +0000 UTC m=+22.883673327" watchObservedRunningTime="2026-04-16 08:33:14.493050621 +0000 UTC m=+22.883904853"
Apr 16 08:33:14.507684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:14.507635 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-144.ec2.internal" podStartSLOduration=21.507621359 podStartE2EDuration="21.507621359s" podCreationTimestamp="2026-04-16 08:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:33:14.507333287 +0000 UTC m=+22.898187518" watchObservedRunningTime="2026-04-16 08:33:14.507621359 +0000 UTC m=+22.898475591"
Apr 16 08:33:15.254170 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:15.254134 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:15.254170 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:15.254155 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:15.254427 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:15.254134 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:15.254427 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:15.254258 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:15.254427 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:15.254408 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:15.254604 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:15.254534 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:15.285825 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:15.285801 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:15.285977 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:15.285918 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:15.285977 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:15.285973 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:23.28595583 +0000 UTC m=+31.676810040 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered
Apr 16 08:33:16.473738 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.473605 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log"
Apr 16 08:33:16.474098 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.474078 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"5c72e0796d948cd83372ccad081a7580939edd4dab4189a0234ee380aaa79f4d"}
Apr 16 08:33:16.474389 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.474367 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:33:16.474514 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.474395 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:33:16.474594 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.474562 2535 scope.go:117] "RemoveContainer" containerID="2b0f7555d7431888c03232657f642d42a12d41ccd9881f0ddaf559f4f002e7fe"
Apr 16 08:33:16.490595 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:16.490419 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:33:17.253922 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.253896 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:17.254108 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.253896 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:17.254108 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:17.253997 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:17.254108 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.253896 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:17.254108 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:17.254069 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:17.254230 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:17.254147 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:17.478928 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.478903 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log"
Apr 16 08:33:17.479706 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.479211 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" event={"ID":"542120ec-6600-4198-bcc2-69755cfcf1d4","Type":"ContainerStarted","Data":"32561d3626c944de0cc41aeee4cd7addbd2c3813b5156674d19fec78ea727440"}
Apr 16 08:33:17.479706 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.479526 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:33:17.480918 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.480896 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="fd652609ab424ffb98c87e2235311bdfc6270b5b548c67b09af8c2b988fab34f" exitCode=0
Apr 16 08:33:17.481048 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.480922 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"fd652609ab424ffb98c87e2235311bdfc6270b5b548c67b09af8c2b988fab34f"}
Apr 16 08:33:17.496329 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.496307 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8"
Apr 16 08:33:17.513137 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:17.513063 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" podStartSLOduration=7.403542522 podStartE2EDuration="25.513052794s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.515629929 +0000 UTC m=+1.906484138" lastFinishedPulling="2026-04-16 08:33:11.625140183 +0000 UTC m=+20.015994410" observedRunningTime="2026-04-16 08:33:17.512591799 +0000 UTC m=+25.903446030" watchObservedRunningTime="2026-04-16 08:33:17.513052794 +0000 UTC m=+25.903907055"
Apr 16 08:33:18.399330 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.399158 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bxg8l"]
Apr 16 08:33:18.399523 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.399408 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:18.399523 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:18.399480 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:18.403146 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.403109 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bfsfg"]
Apr 16 08:33:18.403317 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.403301 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:18.403443 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:18.403423 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:18.404094 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.404074 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-88lvl"]
Apr 16 08:33:18.404176 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.404152 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:18.404242 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:18.404227 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:33:18.483996 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.483969 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="1fd06a315b0a360ef74236b0a64433b9d91d57e5651ba4e15455860897d17d3f" exitCode=0
Apr 16 08:33:18.484361 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:18.484052 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"1fd06a315b0a360ef74236b0a64433b9d91d57e5651ba4e15455860897d17d3f"}
Apr 16 08:33:19.488008 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:19.487970 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="aa0fd7ae7af0c9cc129192314e5d510e7312c725cefaba5fc485f4bbdb658f0d" exitCode=0
Apr 16 08:33:19.488438 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:19.488058 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"aa0fd7ae7af0c9cc129192314e5d510e7312c725cefaba5fc485f4bbdb658f0d"}
Apr 16 08:33:20.254273 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:20.254240 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:20.254534 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:20.254248 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:20.254534 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:20.254381 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066"
Apr 16 08:33:20.254534 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:20.254252 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:20.254534 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:20.254466 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb"
Apr 16 08:33:20.254706 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:20.254548 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:21.573180 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:21.573150 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-v8qrs" Apr 16 08:33:21.573655 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:21.573305 2535 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 08:33:21.574010 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:21.573992 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-v8qrs" Apr 16 08:33:22.254262 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:22.254232 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:22.254517 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:22.254326 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:22.254517 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:22.254352 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066" Apr 16 08:33:22.254517 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:22.254398 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:22.254517 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:22.254418 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:22.254918 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:22.254883 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:23.347647 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:23.347607 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:23.348036 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:23.347742 2535 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:23.348036 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:23.347816 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret podName:ef850f58-6397-465e-9a95-6088bf0af066 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:39.347799642 +0000 UTC m=+47.738653852 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret") pod "global-pull-secret-syncer-bxg8l" (UID: "ef850f58-6397-465e-9a95-6088bf0af066") : object "kube-system"/"original-pull-secret" not registered Apr 16 08:33:24.256785 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.256757 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:24.256785 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.256786 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:24.257032 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.256880 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021" Apr 16 08:33:24.257032 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.256954 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-bxg8l" podUID="ef850f58-6397-465e-9a95-6088bf0af066" Apr 16 08:33:24.257032 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.257001 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:24.257154 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.257066 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bfsfg" podUID="3f8fae38-02fe-4a19-b915-1b456238b4eb" Apr 16 08:33:24.471406 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.471374 2535 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-144.ec2.internal" event="NodeReady" Apr 16 08:33:24.471843 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.471526 2535 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 08:33:24.506542 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.506517 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"] Apr 16 08:33:24.545111 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.545032 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"] Apr 16 08:33:24.545252 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.545192 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.547510 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.547476 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 08:33:24.547649 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.547616 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 08:33:24.547649 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.547627 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dk699\"" Apr 16 08:33:24.547768 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.547663 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 08:33:24.553884 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.552717 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 08:33:24.560255 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.560230 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"] Apr 16 08:33:24.560255 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.560258 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tzcjj"] Apr 16 08:33:24.560438 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.560370 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:24.562836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.562567 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 08:33:24.562836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.562638 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l29ml\"" Apr 16 08:33:24.562836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.562647 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 08:33:24.584749 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.584730 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hgxtv"] Apr 16 08:33:24.584908 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.584890 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.587067 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.587047 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\"" Apr 16 08:33:24.587224 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.587202 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 08:33:24.587335 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.587212 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 08:33:24.600472 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.600444 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"] Apr 16 08:33:24.600472 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.600472 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hgxtv"] Apr 16 08:33:24.600615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.600485 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tzcjj"] Apr 16 08:33:24.600615 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.600585 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:33:24.602810 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.602783 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\"" Apr 16 08:33:24.602908 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.602833 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 08:33:24.602908 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.602837 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 08:33:24.603026 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.602978 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 08:33:24.656839 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656809 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.656839 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656848 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657043 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656887 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657043 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656920 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657043 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656935 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657043 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.656960 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657043 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657018 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657069 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:24.657229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657094 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37aa253a-35df-4129-a89c-8aff6799646f-tmp-dir\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.657229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657114 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h57h\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.657229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657157 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73644da9-c64b-4803-b5a4-ff849e2de647-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:24.657229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657213 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37aa253a-35df-4129-a89c-8aff6799646f-config-volume\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.657410 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657234 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx9lp\" (UniqueName: \"kubernetes.io/projected/37aa253a-35df-4129-a89c-8aff6799646f-kube-api-access-tx9lp\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.657410 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.657261 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.757766 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.757730 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.757934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.757791 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.757934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.757820 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.757934 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.757847 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758094 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758050 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758145 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758108 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:24.758196 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758147 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:33:24.758196 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758179 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37aa253a-35df-4129-a89c-8aff6799646f-tmp-dir\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.758294 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758201 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9h57h\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758294 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758234 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73644da9-c64b-4803-b5a4-ff849e2de647-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:24.758294 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758267 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/37aa253a-35df-4129-a89c-8aff6799646f-config-volume\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.758294 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758288 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tx9lp\" (UniqueName: \"kubernetes.io/projected/37aa253a-35df-4129-a89c-8aff6799646f-kube-api-access-tx9lp\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.758484 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758309 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758484 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.758410 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 08:33:24.758606 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.758515 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:25.258472093 +0000 UTC m=+33.649326302 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found Apr 16 08:33:24.758606 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758520 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758606 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758602 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/37aa253a-35df-4129-a89c-8aff6799646f-tmp-dir\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.758801 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758699 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:24.758801 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758742 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kmnh\" (UniqueName: \"kubernetes.io/projected/35c727f2-1ff4-4364-a35e-46305564bbb8-kube-api-access-7kmnh\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:33:24.758801 ip-10-0-139-144 
kubenswrapper[2535]: E0416 08:33:24.758768 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:24.758959 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758802 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:24.758959 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.758829 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:25.258811225 +0000 UTC m=+33.649665438 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:24.758959 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758871 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.758959 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758917 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.759144 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.758963 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 08:33:24.759144 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.758975 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found
Apr 16 08:33:24.759144 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.758981 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/73644da9-c64b-4803-b5a4-ff849e2de647-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:33:24.759144 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.759021 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:25.259009989 +0000 UTC m=+33.649864219 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found
Apr 16 08:33:24.759295 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.759177 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37aa253a-35df-4129-a89c-8aff6799646f-config-volume\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:24.762684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.762654 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.762684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.762662 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.782124 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.782096 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.782236 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.782134 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9h57h\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:24.782236 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.782165 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx9lp\" (UniqueName: \"kubernetes.io/projected/37aa253a-35df-4129-a89c-8aff6799646f-kube-api-access-tx9lp\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:24.859759 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.859673 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kmnh\" (UniqueName: \"kubernetes.io/projected/35c727f2-1ff4-4364-a35e-46305564bbb8-kube-api-access-7kmnh\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:24.859759 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.859754 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:24.859988 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.859811 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:24.859988 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.859903 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:24.859988 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.859913 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:24.859988 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.859965 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:25.359946953 +0000 UTC m=+33.750801165 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found
Apr 16 08:33:24.859988 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.859983 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:56.859974126 +0000 UTC m=+65.250828342 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 08:33:24.869103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.869070 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kmnh\" (UniqueName: \"kubernetes.io/projected/35c727f2-1ff4-4364-a35e-46305564bbb8-kube-api-access-7kmnh\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:24.961228 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:24.961058 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:24.961228 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.961209 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 08:33:24.961228 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.961231 2535 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 08:33:24.961433 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.961244 2535 projected.go:194] Error preparing data for projected volume kube-api-access-dpd7c for pod openshift-network-diagnostics/network-check-target-bfsfg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:24.961433 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:24.961296 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c podName:3f8fae38-02fe-4a19-b915-1b456238b4eb nodeName:}" failed. No retries permitted until 2026-04-16 08:33:56.961279902 +0000 UTC m=+65.352134111 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-dpd7c" (UniqueName: "kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c") pod "network-check-target-bfsfg" (UID: "3f8fae38-02fe-4a19-b915-1b456238b4eb") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 08:33:25.262853 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:25.262809 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:25.263002 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:25.262904 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:33:25.263002 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.262962 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 08:33:25.263002 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.262986 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found
Apr 16 08:33:25.263113 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.263024 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 08:33:25.263113 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:25.262963 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:25.263113 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.263044 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 08:33:25.263113 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.263050 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:26.263026145 +0000 UTC m=+34.653880371 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found
Apr 16 08:33:25.263255 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.263118 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:26.263105259 +0000 UTC m=+34.653959468 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:25.263255 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.263132 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:26.263126057 +0000 UTC m=+34.653980266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found
Apr 16 08:33:25.364241 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:25.364204 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:25.364383 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.364350 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:25.364433 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:25.364422 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:26.364407858 +0000 UTC m=+34.755262067 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found
Apr 16 08:33:26.254136 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.254101 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:33:26.254741 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.254101 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg"
Apr 16 08:33:26.254741 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.254101 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:26.256600 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.256575 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 08:33:26.257278 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.257264 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 08:33:26.258207 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.258192 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 08:33:26.258301 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.258224 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 08:33:26.258301 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.258218 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\""
Apr 16 08:33:26.258529 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.258516 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-blgdz\""
Apr 16 08:33:26.270766 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.270743 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:26.270868 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.270811 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:26.270916 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.270875 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 08:33:26.270916 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.270880 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:33:26.271029 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.270931 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:28.270916819 +0000 UTC m=+36.661771027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:26.271029 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.270969 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 08:33:26.271029 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.271019 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:28.271002336 +0000 UTC m=+36.661856562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found
Apr 16 08:33:26.271132 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.271077 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 08:33:26.271132 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.271088 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found
Apr 16 08:33:26.271132 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.271120 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:28.271109607 +0000 UTC m=+36.661963834 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found
Apr 16 08:33:26.372029 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.371995 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:26.372203 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.372185 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:26.372256 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:26.372247 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:28.372229952 +0000 UTC m=+36.763084164 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found
Apr 16 08:33:26.502467 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.502437 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="dc2fc0d27ab8772c10e1e9653851a1b17aad15b508a49135e95affb42bf34054" exitCode=0
Apr 16 08:33:26.502603 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:26.502510 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"dc2fc0d27ab8772c10e1e9653851a1b17aad15b508a49135e95affb42bf34054"}
Apr 16 08:33:27.507234 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:27.507203 2535 generic.go:358] "Generic (PLEG): container finished" podID="c79b1bf6-7559-4369-a195-27e8876dde6f" containerID="c08d0183a03e84b817b7d1d0474fe2a7e155086a5c9bd2833d691436e1d71bd2" exitCode=0
Apr 16 08:33:27.507695 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:27.507240 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerDied","Data":"c08d0183a03e84b817b7d1d0474fe2a7e155086a5c9bd2833d691436e1d71bd2"}
Apr 16 08:33:28.288029 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.287959 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:28.288029 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.288012 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.288067 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288106 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288166 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:32.288152845 +0000 UTC m=+40.679007054 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288168 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288172 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288189 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288219 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:32.288204438 +0000 UTC m=+40.679058650 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found
Apr 16 08:33:28.288229 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.288232 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:32.288226685 +0000 UTC m=+40.679080894 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found
Apr 16 08:33:28.389378 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.389352 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:28.389589 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.389502 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:28.389589 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:28.389560 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:32.389547481 +0000 UTC m=+40.780401695 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found
Apr 16 08:33:28.512351 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.512319 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" event={"ID":"c79b1bf6-7559-4369-a195-27e8876dde6f","Type":"ContainerStarted","Data":"a37fa512e77dba2853d4f135dd14742f8eda3b6b70dd7db30046a1444661ff11"}
Apr 16 08:33:28.537431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:28.537386 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-g4m4l" podStartSLOduration=4.40248959 podStartE2EDuration="36.537372346s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:32:53.274002107 +0000 UTC m=+1.664856323" lastFinishedPulling="2026-04-16 08:33:25.408884854 +0000 UTC m=+33.799739079" observedRunningTime="2026-04-16 08:33:28.53627369 +0000 UTC m=+36.927127920" watchObservedRunningTime="2026-04-16 08:33:28.537372346 +0000 UTC m=+36.928226576"
Apr 16 08:33:32.321070 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:32.321031 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:32.321100 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321181 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321192 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321209 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321226 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.321213776 +0000 UTC m=+48.712067985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321259 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.321241943 +0000 UTC m=+48.712096165 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:32.321316 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321402 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 08:33:32.321528 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.321431 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.321423738 +0000 UTC m=+48.712277947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found
Apr 16 08:33:32.422397 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:32.422364 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:33:32.422595 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.422543 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 08:33:32.422640 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:32.422612 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:40.422591878 +0000 UTC m=+48.813446105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found
Apr 16 08:33:39.380358 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:39.380316 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:39.383025 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:39.382997 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/ef850f58-6397-465e-9a95-6088bf0af066-original-pull-secret\") pod \"global-pull-secret-syncer-bxg8l\" (UID: \"ef850f58-6397-465e-9a95-6088bf0af066\") " pod="kube-system/global-pull-secret-syncer-bxg8l"
Apr 16 08:33:39.474266 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:39.474234 2535 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-bxg8l" Apr 16 08:33:39.640664 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:39.640634 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-bxg8l"] Apr 16 08:33:39.644445 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:33:39.644418 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef850f58_6397_465e_9a95_6088bf0af066.slice/crio-f9f739fbc774851b1c2538c1883f7d4169b77adb3d0a417c90c530ca64b8d611 WatchSource:0}: Error finding container f9f739fbc774851b1c2538c1883f7d4169b77adb3d0a417c90c530ca64b8d611: Status 404 returned error can't find the container with id f9f739fbc774851b1c2538c1883f7d4169b77adb3d0a417c90c530ca64b8d611 Apr 16 08:33:40.387971 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:40.387932 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:40.387999 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:40.388036 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: 
\"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388113 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388136 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388148 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388148 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388210 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:33:56.388188678 +0000 UTC m=+64.779042890 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388315 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:56.388254526 +0000 UTC m=+64.779108750 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found Apr 16 08:33:40.388425 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.388401 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:33:56.38838745 +0000 UTC m=+64.779241660 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found Apr 16 08:33:40.489086 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:40.489053 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:33:40.489270 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.489192 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:40.489270 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:40.489255 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. 
No retries permitted until 2026-04-16 08:33:56.489237507 +0000 UTC m=+64.880091722 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found Apr 16 08:33:40.535576 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:40.535526 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bxg8l" event={"ID":"ef850f58-6397-465e-9a95-6088bf0af066","Type":"ContainerStarted","Data":"f9f739fbc774851b1c2538c1883f7d4169b77adb3d0a417c90c530ca64b8d611"} Apr 16 08:33:43.543092 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:43.543045 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-bxg8l" event={"ID":"ef850f58-6397-465e-9a95-6088bf0af066","Type":"ContainerStarted","Data":"15a7f67054e66617db0725d0f42a2ea131b4803b8db33f83154e38a066f3e53b"} Apr 16 08:33:43.559248 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:43.559204 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-bxg8l" podStartSLOduration=32.892757343 podStartE2EDuration="36.559189546s" podCreationTimestamp="2026-04-16 08:33:07 +0000 UTC" firstStartedPulling="2026-04-16 08:33:39.646178955 +0000 UTC m=+48.037033177" lastFinishedPulling="2026-04-16 08:33:43.31261117 +0000 UTC m=+51.703465380" observedRunningTime="2026-04-16 08:33:43.558832971 +0000 UTC m=+51.949687201" watchObservedRunningTime="2026-04-16 08:33:43.559189546 +0000 UTC m=+51.950043817" Apr 16 08:33:49.498979 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:49.498951 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2b6j8" Apr 16 08:33:56.413893 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.413856 2535 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.413907 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.413989 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.413992 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.414039 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:34:28.414025264 +0000 UTC m=+96.804879472 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.414054 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.414079 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:28.414073029 +0000 UTC m=+96.804927238 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.414142 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.414158 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found Apr 16 08:33:56.414305 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.414197 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:28.414184701 +0000 UTC m=+96.805038909 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found Apr 16 08:33:56.515135 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.515108 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:33:56.515313 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.515256 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:33:56.515358 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.515323 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:34:28.515309017 +0000 UTC m=+96.906163239 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found Apr 16 08:33:56.919270 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.919241 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:33:56.921837 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:56.921821 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 08:33:56.929799 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.929783 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:33:56.929983 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:33:56.929841 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:00.929827249 +0000 UTC m=+129.320681458 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : secret "metrics-daemon-secret" not found Apr 16 08:33:57.019935 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.019897 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:57.022686 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.022663 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 08:33:57.032567 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.032547 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 08:33:57.043505 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.043471 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpd7c\" (UniqueName: \"kubernetes.io/projected/3f8fae38-02fe-4a19-b915-1b456238b4eb-kube-api-access-dpd7c\") pod \"network-check-target-bfsfg\" (UID: \"3f8fae38-02fe-4a19-b915-1b456238b4eb\") " pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:57.172011 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.171927 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-blgdz\"" Apr 16 08:33:57.179671 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.179647 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:33:57.298617 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.298589 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bfsfg"] Apr 16 08:33:57.301550 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:33:57.301523 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f8fae38_02fe_4a19_b915_1b456238b4eb.slice/crio-56368645addb30415ee32a046a3a886bc0639aaa11182dcf00dfcb9b9e874ff1 WatchSource:0}: Error finding container 56368645addb30415ee32a046a3a886bc0639aaa11182dcf00dfcb9b9e874ff1: Status 404 returned error can't find the container with id 56368645addb30415ee32a046a3a886bc0639aaa11182dcf00dfcb9b9e874ff1 Apr 16 08:33:57.570647 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:33:57.570566 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bfsfg" event={"ID":"3f8fae38-02fe-4a19-b915-1b456238b4eb","Type":"ContainerStarted","Data":"56368645addb30415ee32a046a3a886bc0639aaa11182dcf00dfcb9b9e874ff1"} Apr 16 08:34:00.576552 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:00.576512 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bfsfg" event={"ID":"3f8fae38-02fe-4a19-b915-1b456238b4eb","Type":"ContainerStarted","Data":"3885e55ff0ef277809c99c99965289419e3021b4274f972ca2cebc444fffb290"} Apr 16 08:34:00.576944 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:00.576621 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:34:00.595186 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:00.595141 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bfsfg" 
podStartSLOduration=65.855499542 podStartE2EDuration="1m8.595116647s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:33:57.303297063 +0000 UTC m=+65.694151272" lastFinishedPulling="2026-04-16 08:34:00.042913943 +0000 UTC m=+68.433768377" observedRunningTime="2026-04-16 08:34:00.594541631 +0000 UTC m=+68.985395862" watchObservedRunningTime="2026-04-16 08:34:00.595116647 +0000 UTC m=+68.985970871" Apr 16 08:34:28.440431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:28.440385 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj" Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:28.440445 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:28.440485 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440540 2535 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 
08:34:28.440577 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440585 2535 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440603 2535 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-bb65bb9d6-bsh82: secret "image-registry-tls" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440615 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls podName:37aa253a-35df-4129-a89c-8aff6799646f nodeName:}" failed. No retries permitted until 2026-04-16 08:35:32.440593208 +0000 UTC m=+160.831447417 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls") pod "dns-default-tzcjj" (UID: "37aa253a-35df-4129-a89c-8aff6799646f") : secret "dns-default-metrics-tls" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440678 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls podName:0c25e594-c9c3-4fc1-a428-ee29b551fad1 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:32.440656751 +0000 UTC m=+160.831510974 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls") pod "image-registry-bb65bb9d6-bsh82" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1") : secret "image-registry-tls" not found Apr 16 08:34:28.440887 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.440699 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:32.440689279 +0000 UTC m=+160.831543488 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found Apr 16 08:34:28.542224 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:28.542181 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv" Apr 16 08:34:28.542434 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.542369 2535 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 08:34:28.542514 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:34:28.542447 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert podName:35c727f2-1ff4-4364-a35e-46305564bbb8 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:32.542420471 +0000 UTC m=+160.933274699 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert") pod "ingress-canary-hgxtv" (UID: "35c727f2-1ff4-4364-a35e-46305564bbb8") : secret "canary-serving-cert" not found Apr 16 08:34:31.580148 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:34:31.580118 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bfsfg" Apr 16 08:35:00.968215 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:00.968170 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:35:00.968709 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:00.968311 2535 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 08:35:00.968709 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:00.968381 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs podName:7146c66d-f9a6-4b2c-8f79-e72ee1b00021 nodeName:}" failed. No retries permitted until 2026-04-16 08:37:02.968365389 +0000 UTC m=+251.359219598 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs") pod "network-metrics-daemon-88lvl" (UID: "7146c66d-f9a6-4b2c-8f79-e72ee1b00021") : secret "metrics-daemon-secret" not found
Apr 16 08:35:15.175387 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.175353 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"]
Apr 16 08:35:15.178198 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.178183 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.180616 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.180590 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 08:35:15.180616 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.180591 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 08:35:15.181401 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.181383 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 08:35:15.181462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.181383 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-5z9xb\""
Apr 16 08:35:15.190560 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.190540 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"]
Apr 16 08:35:15.269882 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.269855 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnpb\" (UniqueName: \"kubernetes.io/projected/38bca098-9687-4bfd-b00e-f6d693552418-kube-api-access-tnnpb\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.270016 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.269977 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.370548 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.370513 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.370548 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.370557 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnpb\" (UniqueName: \"kubernetes.io/projected/38bca098-9687-4bfd-b00e-f6d693552418-kube-api-access-tnnpb\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.370746 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:15.370673 2535 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 08:35:15.370781 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:15.370748 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls podName:38bca098-9687-4bfd-b00e-f6d693552418 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:15.870729472 +0000 UTC m=+144.261583695 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls") pod "cluster-samples-operator-667775844f-nzlt8" (UID: "38bca098-9687-4bfd-b00e-f6d693552418") : secret "samples-operator-tls" not found
Apr 16 08:35:15.390836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.390805 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnpb\" (UniqueName: \"kubernetes.io/projected/38bca098-9687-4bfd-b00e-f6d693552418-kube-api-access-tnnpb\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.874034 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:15.873993 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:15.874221 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:15.874148 2535 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 08:35:15.874221 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:15.874213 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls podName:38bca098-9687-4bfd-b00e-f6d693552418 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:16.87419625 +0000 UTC m=+145.265050464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls") pod "cluster-samples-operator-667775844f-nzlt8" (UID: "38bca098-9687-4bfd-b00e-f6d693552418") : secret "samples-operator-tls" not found
Apr 16 08:35:16.882573 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:16.882530 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:16.882960 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:16.882647 2535 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 08:35:16.882960 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:16.882700 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls podName:38bca098-9687-4bfd-b00e-f6d693552418 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:18.882687636 +0000 UTC m=+147.273541845 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls") pod "cluster-samples-operator-667775844f-nzlt8" (UID: "38bca098-9687-4bfd-b00e-f6d693552418") : secret "samples-operator-tls" not found
Apr 16 08:35:18.901037 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:18.901001 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:18.901429 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:18.901147 2535 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 08:35:18.901429 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:18.901221 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls podName:38bca098-9687-4bfd-b00e-f6d693552418 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:22.90120521 +0000 UTC m=+151.292059420 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls") pod "cluster-samples-operator-667775844f-nzlt8" (UID: "38bca098-9687-4bfd-b00e-f6d693552418") : secret "samples-operator-tls" not found
Apr 16 08:35:21.860033 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:21.860007 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-clvxz_72c8255d-1385-43a9-b20b-b7dfd0472d12/dns-node-resolver/0.log"
Apr 16 08:35:22.484538 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.484507 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-h5psr"]
Apr 16 08:35:22.487392 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.487378 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.490209 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.490184 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ms4f9\""
Apr 16 08:35:22.490446 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.490427 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 08:35:22.491042 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.491027 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 08:35:22.491127 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.491058 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 08:35:22.491127 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.491119 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 08:35:22.500146 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.500118 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-h5psr"]
Apr 16 08:35:22.527650 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.527628 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-key\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.527776 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.527664 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4skj8\" (UniqueName: \"kubernetes.io/projected/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-kube-api-access-4skj8\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.527776 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.527731 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-cabundle\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.628657 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.628632 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-key\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.628754 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.628671 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4skj8\" (UniqueName: \"kubernetes.io/projected/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-kube-api-access-4skj8\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.628754 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.628714 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-cabundle\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.629327 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.629304 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-cabundle\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.630932 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.630906 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-signing-key\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.638163 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.638139 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4skj8\" (UniqueName: \"kubernetes.io/projected/52d0103e-fd43-46cc-8ea6-1da36a3df9e7-kube-api-access-4skj8\") pod \"service-ca-bfc587fb7-h5psr\" (UID: \"52d0103e-fd43-46cc-8ea6-1da36a3df9e7\") " pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.657824 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.657803 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c4ztq_7efe070a-8151-4485-9a0a-23daecd1d21c/node-ca/0.log"
Apr 16 08:35:22.795859 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.795762 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-h5psr"
Apr 16 08:35:22.910662 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.910632 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-h5psr"]
Apr 16 08:35:22.913463 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:22.913436 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d0103e_fd43_46cc_8ea6_1da36a3df9e7.slice/crio-f114f063a16b058555da964e6c2a61a8f6c249125240320b93b4c19f9b88af13 WatchSource:0}: Error finding container f114f063a16b058555da964e6c2a61a8f6c249125240320b93b4c19f9b88af13: Status 404 returned error can't find the container with id f114f063a16b058555da964e6c2a61a8f6c249125240320b93b4c19f9b88af13
Apr 16 08:35:22.931012 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:22.930985 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:22.931137 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:22.931120 2535 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 08:35:22.931182 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:22.931176 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls podName:38bca098-9687-4bfd-b00e-f6d693552418 nodeName:}" failed. No retries permitted until 2026-04-16 08:35:30.931161381 +0000 UTC m=+159.322015594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls") pod "cluster-samples-operator-667775844f-nzlt8" (UID: "38bca098-9687-4bfd-b00e-f6d693552418") : secret "samples-operator-tls" not found
Apr 16 08:35:23.734449 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:23.734408 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-h5psr" event={"ID":"52d0103e-fd43-46cc-8ea6-1da36a3df9e7","Type":"ContainerStarted","Data":"f114f063a16b058555da964e6c2a61a8f6c249125240320b93b4c19f9b88af13"}
Apr 16 08:35:25.739371 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:25.739334 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-h5psr" event={"ID":"52d0103e-fd43-46cc-8ea6-1da36a3df9e7","Type":"ContainerStarted","Data":"f5e62fd19e90c208b68ff9d1fdd42f75ff011f7099aa26198bebbad6d34e1a7a"}
Apr 16 08:35:25.756977 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:25.756930 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-h5psr" podStartSLOduration=1.857189259 podStartE2EDuration="3.75691869s" podCreationTimestamp="2026-04-16 08:35:22 +0000 UTC" firstStartedPulling="2026-04-16 08:35:22.917987145 +0000 UTC m=+151.308841357" lastFinishedPulling="2026-04-16 08:35:24.817716575 +0000 UTC m=+153.208570788" observedRunningTime="2026-04-16 08:35:25.756859296 +0000 UTC m=+154.147713527" watchObservedRunningTime="2026-04-16 08:35:25.75691869 +0000 UTC m=+154.147772921"
Apr 16 08:35:27.559114 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:27.559076 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1"
Apr 16 08:35:27.570191 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:27.570165 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" podUID="73644da9-c64b-4803-b5a4-ff849e2de647"
Apr 16 08:35:27.595593 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:27.595565 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tzcjj" podUID="37aa253a-35df-4129-a89c-8aff6799646f"
Apr 16 08:35:27.610731 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:27.610701 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hgxtv" podUID="35c727f2-1ff4-4364-a35e-46305564bbb8"
Apr 16 08:35:27.742893 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:27.742866 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:35:27.743057 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:27.742978 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:35:27.743149 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:27.743136 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:35:29.263690 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:29.263649 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-88lvl" podUID="7146c66d-f9a6-4b2c-8f79-e72ee1b00021"
Apr 16 08:35:30.999461 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:30.999426 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:31.001707 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:31.001686 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/38bca098-9687-4bfd-b00e-f6d693552418-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-nzlt8\" (UID: \"38bca098-9687-4bfd-b00e-f6d693552418\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:31.086539 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:31.086510 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"
Apr 16 08:35:31.197250 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:31.197216 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8"]
Apr 16 08:35:31.752321 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:31.752279 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8" event={"ID":"38bca098-9687-4bfd-b00e-f6d693552418","Type":"ContainerStarted","Data":"4d5a57864467a975d640e33095d4b1f2726fcc9ac05ca9d348506a77d34674b6"}
Apr 16 08:35:32.515332 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.515282 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"
Apr 16 08:35:32.515801 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.515363 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:35:32.515801 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:32.515396 2535 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 08:35:32.515801 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:35:32.515476 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert podName:73644da9-c64b-4803-b5a4-ff849e2de647 nodeName:}" failed. No retries permitted until 2026-04-16 08:37:34.515459317 +0000 UTC m=+282.906313530 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-frz7j" (UID: "73644da9-c64b-4803-b5a4-ff849e2de647") : secret "networking-console-plugin-cert" not found
Apr 16 08:35:32.515801 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.515402 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:35:32.518081 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.518060 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"image-registry-bb65bb9d6-bsh82\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") " pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:35:32.518877 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.518843 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37aa253a-35df-4129-a89c-8aff6799646f-metrics-tls\") pod \"dns-default-tzcjj\" (UID: \"37aa253a-35df-4129-a89c-8aff6799646f\") " pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:35:32.546446 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.546424 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dk699\""
Apr 16 08:35:32.546591 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.546502 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-l2kgt\""
Apr 16 08:35:32.554378 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.554307 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:35:32.554514 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.554436 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:35:32.616294 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.616267 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:35:32.619324 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.619284 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/35c727f2-1ff4-4364-a35e-46305564bbb8-cert\") pod \"ingress-canary-hgxtv\" (UID: \"35c727f2-1ff4-4364-a35e-46305564bbb8\") " pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:35:32.914472 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.914446 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tzcjj"]
Apr 16 08:35:32.918881 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:32.918852 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37aa253a_35df_4129_a89c_8aff6799646f.slice/crio-047d264d0da333a932c67e21d81088dc65752ae8059fa831a1d078dc17218144 WatchSource:0}: Error finding container 047d264d0da333a932c67e21d81088dc65752ae8059fa831a1d078dc17218144: Status 404 returned error can't find the container with id 047d264d0da333a932c67e21d81088dc65752ae8059fa831a1d078dc17218144
Apr 16 08:35:32.938392 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:32.938329 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"]
Apr 16 08:35:32.940161 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:32.940125 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c25e594_c9c3_4fc1_a428_ee29b551fad1.slice/crio-9b781651079cf7c724fae49721027e90e3b79541946b9eec0fd9d24938057701 WatchSource:0}: Error finding container 9b781651079cf7c724fae49721027e90e3b79541946b9eec0fd9d24938057701: Status 404 returned error can't find the container with id 9b781651079cf7c724fae49721027e90e3b79541946b9eec0fd9d24938057701
Apr 16 08:35:33.759784 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.759752 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8" event={"ID":"38bca098-9687-4bfd-b00e-f6d693552418","Type":"ContainerStarted","Data":"9353e5b16acdda6dedad427b97d17d9aec87ef90dae37194a98737c6a1327396"}
Apr 16 08:35:33.759784 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.759790 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8" event={"ID":"38bca098-9687-4bfd-b00e-f6d693552418","Type":"ContainerStarted","Data":"dd390aea198fca8d3ce1d0f2e362080a00c03d26671c7cbaede6e91378e0d443"}
Apr 16 08:35:33.760919 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.760893 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tzcjj" event={"ID":"37aa253a-35df-4129-a89c-8aff6799646f","Type":"ContainerStarted","Data":"047d264d0da333a932c67e21d81088dc65752ae8059fa831a1d078dc17218144"}
Apr 16 08:35:33.762087 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.762068 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" event={"ID":"0c25e594-c9c3-4fc1-a428-ee29b551fad1","Type":"ContainerStarted","Data":"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"}
Apr 16 08:35:33.762178 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.762091 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" event={"ID":"0c25e594-c9c3-4fc1-a428-ee29b551fad1","Type":"ContainerStarted","Data":"9b781651079cf7c724fae49721027e90e3b79541946b9eec0fd9d24938057701"}
Apr 16 08:35:33.762248 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.762229 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:35:33.778733 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.778684 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-nzlt8" podStartSLOduration=17.189450347 podStartE2EDuration="18.778665821s" podCreationTimestamp="2026-04-16 08:35:15 +0000 UTC" firstStartedPulling="2026-04-16 08:35:31.242628203 +0000 UTC m=+159.633482414" lastFinishedPulling="2026-04-16 08:35:32.831843679 +0000 UTC m=+161.222697888" observedRunningTime="2026-04-16 08:35:33.777582301 +0000 UTC m=+162.168436529" watchObservedRunningTime="2026-04-16 08:35:33.778665821 +0000 UTC m=+162.169520053"
Apr 16 08:35:33.798211 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:33.798161 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" podStartSLOduration=161.798146616 podStartE2EDuration="2m41.798146616s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:35:33.797384915 +0000 UTC m=+162.188239172" watchObservedRunningTime="2026-04-16 08:35:33.798146616 +0000 UTC m=+162.189000847"
Apr 16 08:35:34.766785 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:34.766741 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tzcjj" event={"ID":"37aa253a-35df-4129-a89c-8aff6799646f","Type":"ContainerStarted","Data":"cc863f2c07e49c12e652616327c5d46d424b4c6ffeae25419f1e6b4722a7685d"}
Apr 16 08:35:34.766785 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:34.766788 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tzcjj" event={"ID":"37aa253a-35df-4129-a89c-8aff6799646f","Type":"ContainerStarted","Data":"4afedc954c80c93694b648a60f2ff0c768884bd2d5589e136d875f9fbfe8078e"}
Apr 16 08:35:34.767388 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:34.766843 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tzcjj"
Apr 16 08:35:34.784772 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:34.784725 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tzcjj" podStartSLOduration=129.548719513 podStartE2EDuration="2m10.784709286s" podCreationTimestamp="2026-04-16 08:33:24 +0000 UTC" firstStartedPulling="2026-04-16 08:35:32.921603486 +0000 UTC m=+161.312457694" lastFinishedPulling="2026-04-16 08:35:34.157593257 +0000 UTC m=+162.548447467" observedRunningTime="2026-04-16 08:35:34.784407876 +0000 UTC m=+163.175262106" watchObservedRunningTime="2026-04-16 08:35:34.784709286 +0000 UTC m=+163.175563524"
Apr 16 08:35:39.253823 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:39.253735 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:35:39.256351 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:39.256330 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-kp7vt\""
Apr 16 08:35:39.265071 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:39.265055 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hgxtv"
Apr 16 08:35:39.381016 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:39.380986 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hgxtv"]
Apr 16 08:35:39.384210 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:39.384183 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35c727f2_1ff4_4364_a35e_46305564bbb8.slice/crio-7a922826e0ebe559f5061763e3b9a10cff5918d4afcca50fae730216ac25c77c WatchSource:0}: Error finding container 7a922826e0ebe559f5061763e3b9a10cff5918d4afcca50fae730216ac25c77c: Status 404 returned error can't find the container with id 7a922826e0ebe559f5061763e3b9a10cff5918d4afcca50fae730216ac25c77c
Apr 16 08:35:39.782167 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:39.782120 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hgxtv" event={"ID":"35c727f2-1ff4-4364-a35e-46305564bbb8","Type":"ContainerStarted","Data":"7a922826e0ebe559f5061763e3b9a10cff5918d4afcca50fae730216ac25c77c"}
Apr 16 08:35:41.790159 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:41.790125 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hgxtv" event={"ID":"35c727f2-1ff4-4364-a35e-46305564bbb8","Type":"ContainerStarted","Data":"31d88d58ad59d1afe1a96b9e03e2df130c897fccf34e16fb9df5f2775d9f0039"}
Apr 16 08:35:41.807277 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:41.807234 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hgxtv" podStartSLOduration=136.351098985 podStartE2EDuration="2m17.807221441s" podCreationTimestamp="2026-04-16 08:33:24 +0000 UTC" firstStartedPulling="2026-04-16 08:35:39.385944976 +0000 UTC m=+167.776799189" lastFinishedPulling="2026-04-16 08:35:40.842067436 +0000 UTC m=+169.232921645" observedRunningTime="2026-04-16 08:35:41.806300626 +0000 UTC m=+170.197154860" watchObservedRunningTime="2026-04-16 08:35:41.807221441 +0000 UTC m=+170.198075672"
Apr 16 08:35:43.253669 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.253630 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:35:43.630120 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.630047 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2j9jj"]
Apr 16 08:35:43.633423 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.633402 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2j9jj"
Apr 16 08:35:43.646463 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.646439 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 16 08:35:43.647322 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.647301 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rqvkt\""
Apr 16 08:35:43.647403 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.647301 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 08:35:43.648687 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.648668 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 08:35:43.649616 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.649601 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 16 08:35:43.656431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.656407 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2j9jj"]
Apr 16 08:35:43.801192 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.801164 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj"
Apr 16 08:35:43.801192 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.801196 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-dzdrs\" (UniqueName: \"kubernetes.io/projected/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-api-access-dzdrs\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.801409 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.801217 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8fb5ded9-9208-45a5-92ea-b444e485f55b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.801409 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.801285 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8fb5ded9-9208-45a5-92ea-b444e485f55b-data-volume\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.801409 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.801331 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8fb5ded9-9208-45a5-92ea-b444e485f55b-crio-socket\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902231 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902197 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " 
pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902231 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902231 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdrs\" (UniqueName: \"kubernetes.io/projected/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-api-access-dzdrs\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902455 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902258 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8fb5ded9-9208-45a5-92ea-b444e485f55b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902455 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902298 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8fb5ded9-9208-45a5-92ea-b444e485f55b-data-volume\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902455 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902316 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8fb5ded9-9208-45a5-92ea-b444e485f55b-crio-socket\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902455 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902394 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/8fb5ded9-9208-45a5-92ea-b444e485f55b-crio-socket\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902766 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902744 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8fb5ded9-9208-45a5-92ea-b444e485f55b-data-volume\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.902839 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.902822 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.904627 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.904600 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8fb5ded9-9208-45a5-92ea-b444e485f55b-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.921948 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:43.921919 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdrs\" (UniqueName: \"kubernetes.io/projected/8fb5ded9-9208-45a5-92ea-b444e485f55b-kube-api-access-dzdrs\") pod \"insights-runtime-extractor-2j9jj\" (UID: \"8fb5ded9-9208-45a5-92ea-b444e485f55b\") " pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:43.942172 ip-10-0-139-144 kubenswrapper[2535]: 
I0416 08:35:43.942155 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2j9jj" Apr 16 08:35:44.062683 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:44.062650 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2j9jj"] Apr 16 08:35:44.065682 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:44.065658 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fb5ded9_9208_45a5_92ea_b444e485f55b.slice/crio-1f92db79c4091751f03e70eb640fe99aba915eab8efce40fc196b8ed89f66ca2 WatchSource:0}: Error finding container 1f92db79c4091751f03e70eb640fe99aba915eab8efce40fc196b8ed89f66ca2: Status 404 returned error can't find the container with id 1f92db79c4091751f03e70eb640fe99aba915eab8efce40fc196b8ed89f66ca2 Apr 16 08:35:44.772844 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:44.772819 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tzcjj" Apr 16 08:35:44.799713 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:44.799685 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j9jj" event={"ID":"8fb5ded9-9208-45a5-92ea-b444e485f55b","Type":"ContainerStarted","Data":"e4c1e3d75e2f4228026a3e3da938d48238560b873f876249f2d297108955b435"} Apr 16 08:35:44.799713 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:44.799714 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j9jj" event={"ID":"8fb5ded9-9208-45a5-92ea-b444e485f55b","Type":"ContainerStarted","Data":"555f23258c603cdefb429ae31da4d3aeb25d46d531e6cd6ef60c0fa500bbc11c"} Apr 16 08:35:44.799855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:44.799723 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j9jj" 
event={"ID":"8fb5ded9-9208-45a5-92ea-b444e485f55b","Type":"ContainerStarted","Data":"1f92db79c4091751f03e70eb640fe99aba915eab8efce40fc196b8ed89f66ca2"} Apr 16 08:35:46.805992 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:46.805957 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2j9jj" event={"ID":"8fb5ded9-9208-45a5-92ea-b444e485f55b","Type":"ContainerStarted","Data":"bd4bbeee60d728f03c5cda8f78ad181524e0d77ee9e43369bf7e732c3fd07dc3"} Apr 16 08:35:46.824990 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:46.824949 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2j9jj" podStartSLOduration=1.8814750569999998 podStartE2EDuration="3.824935981s" podCreationTimestamp="2026-04-16 08:35:43 +0000 UTC" firstStartedPulling="2026-04-16 08:35:44.11607284 +0000 UTC m=+172.506927056" lastFinishedPulling="2026-04-16 08:35:46.059533768 +0000 UTC m=+174.450387980" observedRunningTime="2026-04-16 08:35:46.823852201 +0000 UTC m=+175.214706432" watchObservedRunningTime="2026-04-16 08:35:46.824935981 +0000 UTC m=+175.215790212" Apr 16 08:35:48.503935 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.503898 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-75cbddd885-f99nj"] Apr 16 08:35:48.506827 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.506811 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.508973 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.508952 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 08:35:48.509101 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.509064 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jcp87\"" Apr 16 08:35:48.509153 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.509140 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 08:35:48.510904 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.510320 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 08:35:48.510904 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.510365 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 08:35:48.510904 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.510408 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 08:35:48.510904 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.510432 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 08:35:48.510904 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.510327 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 08:35:48.516890 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.516872 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cbddd885-f99nj"] Apr 16 08:35:48.637137 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:35:48.637107 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgw5r\" (UniqueName: \"kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.637137 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.637141 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.637374 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.637174 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.637374 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.637272 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.637374 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.637328 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config\") pod \"console-75cbddd885-f99nj\" (UID: 
\"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.637526 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.637373 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738689 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738660 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738709 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738757 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738938 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738847 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738938 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738893 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgw5r\" (UniqueName: \"kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.738938 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.738930 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.739459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.739421 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.739459 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.739421 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.739658 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.739595 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.741144 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.741122 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.741259 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.741236 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.747862 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.747840 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgw5r\" (UniqueName: \"kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r\") pod \"console-75cbddd885-f99nj\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") " pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.817367 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.817305 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:48.944920 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:48.944894 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-75cbddd885-f99nj"] Apr 16 08:35:48.947712 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:35:48.947686 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod701168b5_5d8e_4798_849a_64ab0e79bf80.slice/crio-94ece12427c730469fce497f9505f0a237cdc9bac905b35f63b4b15edb8ed5c2 WatchSource:0}: Error finding container 94ece12427c730469fce497f9505f0a237cdc9bac905b35f63b4b15edb8ed5c2: Status 404 returned error can't find the container with id 94ece12427c730469fce497f9505f0a237cdc9bac905b35f63b4b15edb8ed5c2 Apr 16 08:35:49.814481 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:49.814439 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cbddd885-f99nj" event={"ID":"701168b5-5d8e-4798-849a-64ab0e79bf80","Type":"ContainerStarted","Data":"94ece12427c730469fce497f9505f0a237cdc9bac905b35f63b4b15edb8ed5c2"} Apr 16 08:35:51.821214 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:51.821177 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cbddd885-f99nj" event={"ID":"701168b5-5d8e-4798-849a-64ab0e79bf80","Type":"ContainerStarted","Data":"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"} Apr 16 08:35:51.839909 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:51.839797 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-75cbddd885-f99nj" podStartSLOduration=1.212598711 podStartE2EDuration="3.83978195s" podCreationTimestamp="2026-04-16 08:35:48 +0000 UTC" firstStartedPulling="2026-04-16 08:35:48.949584527 +0000 UTC m=+177.340438735" lastFinishedPulling="2026-04-16 08:35:51.576767566 +0000 UTC m=+179.967621974" 
observedRunningTime="2026-04-16 08:35:51.839163491 +0000 UTC m=+180.230017733" watchObservedRunningTime="2026-04-16 08:35:51.83978195 +0000 UTC m=+180.230636181" Apr 16 08:35:52.558288 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:52.558247 2535 patch_prober.go:28] interesting pod/image-registry-bb65bb9d6-bsh82 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 08:35:52.558454 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:52.558316 2535 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 08:35:54.771533 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:54.771485 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" Apr 16 08:35:58.817905 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:58.817866 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:58.818285 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:58.817918 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:58.822499 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:58.822466 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:35:58.841010 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:35:58.840981 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-75cbddd885-f99nj" Apr 16 08:36:04.951555 
ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.951518 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2"] Apr 16 08:36:04.957633 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.957608 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:04.960058 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.960036 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 08:36:04.960229 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.960035 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 08:36:04.960356 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.960040 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 08:36:04.960356 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.960073 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 08:36:04.961091 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.961074 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-6298c\"" Apr 16 08:36:04.961285 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.961119 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 08:36:04.963102 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.963079 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:04.963295 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.963279 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:04.963445 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.963431 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce9b851-bc26-4187-b2c8-4da5028f14b8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:04.963565 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.963552 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzqn6\" (UniqueName: \"kubernetes.io/projected/1ce9b851-bc26-4187-b2c8-4da5028f14b8-kube-api-access-vzqn6\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:04.963845 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.963822 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2"] Apr 16 08:36:04.967112 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.967091 2535 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-68mtk"] Apr 16 08:36:04.971041 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.971020 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:04.973586 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.973568 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 08:36:04.973795 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.973650 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 08:36:04.974086 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.974070 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wmxc2\"" Apr 16 08:36:04.974267 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.974249 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 08:36:04.983061 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.983042 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-zzjzh"] Apr 16 08:36:04.986643 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.986614 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-68mtk"] Apr 16 08:36:04.986759 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.986746 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:04.989157 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.989127 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 08:36:04.989433 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.989409 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 08:36:04.989776 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.989761 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 08:36:04.989969 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:04.989917 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ggsvg\"" Apr 16 08:36:05.063959 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.063934 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-sys\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064116 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.063967 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064116 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064043 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-metrics-client-ca\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064116 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064062 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-textfile\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064116 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064082 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-wtmp\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064138 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064174 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064207 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064232 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-root\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064257 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpbh\" (UniqueName: \"kubernetes.io/projected/9587686e-6131-4295-8530-2461b71a63c0-kube-api-access-kbpbh\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064277 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 
08:36:05.064315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064312 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064343 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064369 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:05.064383 2535 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064400 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: 
\"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064427 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvdwb\" (UniqueName: \"kubernetes.io/projected/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-api-access-nvdwb\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:05.064448 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls podName:1ce9b851-bc26-4187-b2c8-4da5028f14b8 nodeName:}" failed. No retries permitted until 2026-04-16 08:36:05.564430819 +0000 UTC m=+193.955285044 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-kzhf2" (UID: "1ce9b851-bc26-4187-b2c8-4da5028f14b8") : secret "openshift-state-metrics-tls" not found Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064478 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064523 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1ce9b851-bc26-4187-b2c8-4da5028f14b8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.064684 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.064546 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzqn6\" (UniqueName: \"kubernetes.io/projected/1ce9b851-bc26-4187-b2c8-4da5028f14b8-kube-api-access-vzqn6\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.065259 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.065236 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ce9b851-bc26-4187-b2c8-4da5028f14b8-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.066925 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.066905 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.075155 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.075132 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzqn6\" (UniqueName: \"kubernetes.io/projected/1ce9b851-bc26-4187-b2c8-4da5028f14b8-kube-api-access-vzqn6\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: 
\"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.165581 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165547 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.165745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165591 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nvdwb\" (UniqueName: \"kubernetes.io/projected/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-api-access-nvdwb\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.165745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165620 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165668 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-sys\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165702 2535 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.165745 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165733 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-metrics-client-ca\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165761 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-textfile\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165786 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-wtmp\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165821 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " 
pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165864 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165889 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-root\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165912 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpbh\" (UniqueName: \"kubernetes.io/projected/9587686e-6131-4295-8530-2461b71a63c0-kube-api-access-kbpbh\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165965 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165966 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-volume-directive-shadow\") pod 
\"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165977 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-wtmp\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.165999 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165994 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.166507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.166031 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.166507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.166288 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-accelerators-collector-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.166507 ip-10-0-139-144 kubenswrapper[2535]: 
I0416 08:36:05.166327 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9587686e-6131-4295-8530-2461b71a63c0-metrics-client-ca\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.166507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.166363 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-root\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.166724 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.166680 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.166808 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:05.166778 2535 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 08:36:05.166874 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:05.166863 2535 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls podName:9587686e-6131-4295-8530-2461b71a63c0 nodeName:}" failed. No retries permitted until 2026-04-16 08:36:05.666840431 +0000 UTC m=+194.057694655 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls") pod "node-exporter-zzjzh" (UID: "9587686e-6131-4295-8530-2461b71a63c0") : secret "node-exporter-tls" not found Apr 16 08:36:05.166874 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.166865 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-textfile\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.167056 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.167030 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.167147 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.165771 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9587686e-6131-4295-8530-2461b71a63c0-sys\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.168640 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.168609 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.168885 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:36:05.168863 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.169748 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.169725 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.180968 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.180945 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpbh\" (UniqueName: \"kubernetes.io/projected/9587686e-6131-4295-8530-2461b71a63c0-kube-api-access-kbpbh\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.181206 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.181186 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvdwb\" (UniqueName: \"kubernetes.io/projected/39bc34c3-d7f5-4497-a3ca-f2a44cf292f8-kube-api-access-nvdwb\") pod \"kube-state-metrics-7479c89684-68mtk\" (UID: \"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.283255 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.283186 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" Apr 16 08:36:05.409004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.408972 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-68mtk"] Apr 16 08:36:05.411993 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:36:05.411960 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39bc34c3_d7f5_4497_a3ca_f2a44cf292f8.slice/crio-51b79acc75860b4615f1ddd895832e5ab9a958202cfc77f3dc7c03dcf7aafb9d WatchSource:0}: Error finding container 51b79acc75860b4615f1ddd895832e5ab9a958202cfc77f3dc7c03dcf7aafb9d: Status 404 returned error can't find the container with id 51b79acc75860b4615f1ddd895832e5ab9a958202cfc77f3dc7c03dcf7aafb9d Apr 16 08:36:05.569689 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.569615 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.571886 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.571868 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1ce9b851-bc26-4187-b2c8-4da5028f14b8-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-kzhf2\" (UID: \"1ce9b851-bc26-4187-b2c8-4da5028f14b8\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.670450 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.670417 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.672586 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.672569 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9587686e-6131-4295-8530-2461b71a63c0-node-exporter-tls\") pod \"node-exporter-zzjzh\" (UID: \"9587686e-6131-4295-8530-2461b71a63c0\") " pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.678778 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.678755 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"] Apr 16 08:36:05.855165 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.855081 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" event={"ID":"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8","Type":"ContainerStarted","Data":"51b79acc75860b4615f1ddd895832e5ab9a958202cfc77f3dc7c03dcf7aafb9d"} Apr 16 08:36:05.869420 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.869390 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" Apr 16 08:36:05.898960 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:05.898926 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-zzjzh" Apr 16 08:36:05.908188 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:36:05.908157 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9587686e_6131_4295_8530_2461b71a63c0.slice/crio-5a9c003a57fb693bf1bdad323f5eb63531c7fd6a2e13e58a9ccc92df9f469076 WatchSource:0}: Error finding container 5a9c003a57fb693bf1bdad323f5eb63531c7fd6a2e13e58a9ccc92df9f469076: Status 404 returned error can't find the container with id 5a9c003a57fb693bf1bdad323f5eb63531c7fd6a2e13e58a9ccc92df9f469076 Apr 16 08:36:06.019503 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.019449 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2"] Apr 16 08:36:06.022933 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:36:06.022901 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ce9b851_bc26_4187_b2c8_4da5028f14b8.slice/crio-bd5f82f510e1ad9695322d7d12d555b8126e0824e621777f0767b1abf461a676 WatchSource:0}: Error finding container bd5f82f510e1ad9695322d7d12d555b8126e0824e621777f0767b1abf461a676: Status 404 returned error can't find the container with id bd5f82f510e1ad9695322d7d12d555b8126e0824e621777f0767b1abf461a676 Apr 16 08:36:06.864261 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.864182 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zzjzh" event={"ID":"9587686e-6131-4295-8530-2461b71a63c0","Type":"ContainerStarted","Data":"5a9c003a57fb693bf1bdad323f5eb63531c7fd6a2e13e58a9ccc92df9f469076"} Apr 16 08:36:06.866130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.866054 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" 
event={"ID":"1ce9b851-bc26-4187-b2c8-4da5028f14b8","Type":"ContainerStarted","Data":"4b0d821e7bbe900eed030596a4f01032482a97134485d54339b333b39303bcc2"} Apr 16 08:36:06.866130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.866093 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" event={"ID":"1ce9b851-bc26-4187-b2c8-4da5028f14b8","Type":"ContainerStarted","Data":"f7826a937972f8963825a1f708defab64367e57cb6d667a2442a1bc457a64763"} Apr 16 08:36:06.866130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.866109 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" event={"ID":"1ce9b851-bc26-4187-b2c8-4da5028f14b8","Type":"ContainerStarted","Data":"bd5f82f510e1ad9695322d7d12d555b8126e0824e621777f0767b1abf461a676"} Apr 16 08:36:06.868757 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.868667 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" event={"ID":"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8","Type":"ContainerStarted","Data":"4b91e02f45091196f9e9819f91ca6c5efc0204da36a2f23a019c7b1e1ebd6624"} Apr 16 08:36:06.868757 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.868703 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" event={"ID":"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8","Type":"ContainerStarted","Data":"b773c56b2a2a9885aacd173be7aa0ea74e2357250cd1eca3aa74305618ecd945"} Apr 16 08:36:06.868757 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:06.868717 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" event={"ID":"39bc34c3-d7f5-4497-a3ca-f2a44cf292f8","Type":"ContainerStarted","Data":"955d56181201f9bb9bd22d6341d18aa8fec68a1e535670829c1b5e2b9bad1fc0"} Apr 16 08:36:06.889355 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:36:06.889287 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-68mtk" podStartSLOduration=1.782339349 podStartE2EDuration="2.889268883s" podCreationTimestamp="2026-04-16 08:36:04 +0000 UTC" firstStartedPulling="2026-04-16 08:36:05.413815134 +0000 UTC m=+193.804669342" lastFinishedPulling="2026-04-16 08:36:06.520744663 +0000 UTC m=+194.911598876" observedRunningTime="2026-04-16 08:36:06.887356522 +0000 UTC m=+195.278210794" watchObservedRunningTime="2026-04-16 08:36:06.889268883 +0000 UTC m=+195.280123117" Apr 16 08:36:07.116855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:07.116767 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75cbddd885-f99nj"] Apr 16 08:36:07.872884 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:07.872849 2535 generic.go:358] "Generic (PLEG): container finished" podID="9587686e-6131-4295-8530-2461b71a63c0" containerID="40f12bb1c2fb88b253a4ca63911e555756974fdcba565455d4313fdc15ef2811" exitCode=0 Apr 16 08:36:07.873092 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:07.872921 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zzjzh" event={"ID":"9587686e-6131-4295-8530-2461b71a63c0","Type":"ContainerDied","Data":"40f12bb1c2fb88b253a4ca63911e555756974fdcba565455d4313fdc15ef2811"} Apr 16 08:36:07.874724 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:07.874698 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" event={"ID":"1ce9b851-bc26-4187-b2c8-4da5028f14b8","Type":"ContainerStarted","Data":"6eca972c397c3426ea96ea2bfa015f291ca7242a2d4aa59ab458f4d6ac56c436"} Apr 16 08:36:07.909740 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:07.909689 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-kzhf2" podStartSLOduration=2.811622765 
podStartE2EDuration="3.909672199s" podCreationTimestamp="2026-04-16 08:36:04 +0000 UTC" firstStartedPulling="2026-04-16 08:36:06.169755666 +0000 UTC m=+194.560609877" lastFinishedPulling="2026-04-16 08:36:07.267805101 +0000 UTC m=+195.658659311" observedRunningTime="2026-04-16 08:36:07.908741265 +0000 UTC m=+196.299595519" watchObservedRunningTime="2026-04-16 08:36:07.909672199 +0000 UTC m=+196.300526428" Apr 16 08:36:08.882820 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:08.882788 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zzjzh" event={"ID":"9587686e-6131-4295-8530-2461b71a63c0","Type":"ContainerStarted","Data":"c683392ed8bb872de11289fe61b77654ad1d2b3c4666d43466e2675bcef6b68b"} Apr 16 08:36:08.882820 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:08.882825 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-zzjzh" event={"ID":"9587686e-6131-4295-8530-2461b71a63c0","Type":"ContainerStarted","Data":"b0fc24edebf798ec5305ac94230070478270f7fc73a81def8147c57dbf783e77"} Apr 16 08:36:08.909432 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:08.909366 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-zzjzh" podStartSLOduration=3.86215866 podStartE2EDuration="4.909352701s" podCreationTimestamp="2026-04-16 08:36:04 +0000 UTC" firstStartedPulling="2026-04-16 08:36:05.910558679 +0000 UTC m=+194.301412888" lastFinishedPulling="2026-04-16 08:36:06.957752718 +0000 UTC m=+195.348606929" observedRunningTime="2026-04-16 08:36:08.909194073 +0000 UTC m=+197.300048305" watchObservedRunningTime="2026-04-16 08:36:08.909352701 +0000 UTC m=+197.300206932" Apr 16 08:36:09.768185 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.768154 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:36:09.771272 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.771253 2535 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.778103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.778081 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 08:36:09.784364 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.784344 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:36:09.802887 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.802867 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.802990 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.802896 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7jw\" (UniqueName: \"kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.802990 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.802918 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.802990 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.802981 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.803111 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.803018 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.803111 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.803069 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.803111 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.803100 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.903643 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.903615 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904074 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:36:09.903649 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7jw\" (UniqueName: \"kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904074 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.903692 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904074 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.903844 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904258 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904148 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904258 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904235 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " 
pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904378 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904266 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904643 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904619 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904775 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904740 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.904899 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.904883 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.905198 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.905160 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle\") pod \"console-857857768-cxrw2\" (UID: 
\"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.906152 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.906134 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.906277 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.906259 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:09.912261 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:09.912237 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7jw\" (UniqueName: \"kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw\") pod \"console-857857768-cxrw2\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:10.080693 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:10.080602 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857857768-cxrw2" Apr 16 08:36:10.195001 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:10.194969 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:36:10.198031 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:36:10.197998 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb30018fa_2aab_4a5f_a2e3_d2c3b7f212eb.slice/crio-cc2a44c1dcb272289cf108b85ec804d0e9517b11a8cdaf195a79050048349964 WatchSource:0}: Error finding container cc2a44c1dcb272289cf108b85ec804d0e9517b11a8cdaf195a79050048349964: Status 404 returned error can't find the container with id cc2a44c1dcb272289cf108b85ec804d0e9517b11a8cdaf195a79050048349964 Apr 16 08:36:10.889372 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:10.889336 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857768-cxrw2" event={"ID":"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb","Type":"ContainerStarted","Data":"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7"} Apr 16 08:36:10.889372 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:10.889376 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857768-cxrw2" event={"ID":"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb","Type":"ContainerStarted","Data":"cc2a44c1dcb272289cf108b85ec804d0e9517b11a8cdaf195a79050048349964"} Apr 16 08:36:10.908439 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:10.908396 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-857857768-cxrw2" podStartSLOduration=1.9083821090000002 podStartE2EDuration="1.908382109s" podCreationTimestamp="2026-04-16 08:36:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 08:36:10.907315011 +0000 UTC 
m=+199.298169242" watchObservedRunningTime="2026-04-16 08:36:10.908382109 +0000 UTC m=+199.299236340" Apr 16 08:36:11.268640 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.268565 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 08:36:11.272164 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.272146 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.274933 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.274908 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 08:36:11.275031 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.274941 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 08:36:11.275031 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.274969 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 08:36:11.275031 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275019 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 08:36:11.275195 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275036 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275368 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275387 2535 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-7drf5\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275388 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275411 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275401 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8slqdgjusean9\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275455 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 08:36:11.275546 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.275417 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 08:36:11.276097 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.276082 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 08:36:11.276559 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.276542 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 08:36:11.278530 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.278510 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 08:36:11.285223 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.285202 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 08:36:11.313855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313829 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jb5l\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-kube-api-access-2jb5l\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.313855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313856 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.313978 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313874 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.313978 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313893 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.313978 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313913 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-assets\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314072 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.313972 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314072 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314034 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314074 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314095 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 08:36:11.314130 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314111 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-web-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314218 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314141 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314218 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314161 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314218 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314175 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314218 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314203 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-metrics-client-certs\") pod 
\"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314413 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314300 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314413 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314357 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314536 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314427 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.314536 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.314463 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.415768 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415733 2535 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.415768 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415770 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415792 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415810 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415839 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 
08:36:11.415881 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415897 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415916 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-web-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415943 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415968 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 16 08:36:11.416004 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.415991 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416023 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416055 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416085 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416113 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416137 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416198 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jb5l\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-kube-api-access-2jb5l\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416431 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416228 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416795 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416716 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.416911 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.416884 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.418842 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419084 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419101 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419402 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419458 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419664 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.419769 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config-out\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.420038 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.420113 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420359 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.420319 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cce5ce63-f276-486b-8ac1-1a65e75b99bd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.420961 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.420930 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.421549 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.421528 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-web-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.421693 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.421675 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-config\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.421758 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.421745 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.421842 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.421825 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/cce5ce63-f276-486b-8ac1-1a65e75b99bd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.431561 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.431541 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jb5l\" (UniqueName: \"kubernetes.io/projected/cce5ce63-f276-486b-8ac1-1a65e75b99bd-kube-api-access-2jb5l\") pod \"prometheus-k8s-0\" (UID: \"cce5ce63-f276-486b-8ac1-1a65e75b99bd\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.582206 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.582125 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:11.703657 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.703625 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 08:36:11.706179 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:36:11.706152 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce5ce63_f276_486b_8ac1_1a65e75b99bd.slice/crio-4f6e7132df166e910442afe0409b9a1f70d3680ef2b5c288ece3bfbb81575930 WatchSource:0}: Error finding container 4f6e7132df166e910442afe0409b9a1f70d3680ef2b5c288ece3bfbb81575930: Status 404 returned error can't find the container with id 4f6e7132df166e910442afe0409b9a1f70d3680ef2b5c288ece3bfbb81575930
Apr 16 08:36:11.892556 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:11.892524 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"4f6e7132df166e910442afe0409b9a1f70d3680ef2b5c288ece3bfbb81575930"}
Apr 16 08:36:12.898023 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:12.897996 2535 generic.go:358] "Generic (PLEG): container finished" podID="cce5ce63-f276-486b-8ac1-1a65e75b99bd" containerID="a9aa3e1fd7c3df1e1f62c7b4be28de8e90eddff3f4c9b1af389b512743355e8e" exitCode=0
Apr 16 08:36:12.898395 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:12.898054 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerDied","Data":"a9aa3e1fd7c3df1e1f62c7b4be28de8e90eddff3f4c9b1af389b512743355e8e"}
Apr 16 08:36:15.908147 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:15.908113 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"0606e255b6379268bdde9691cfcf739d72c76e2fd9e5d3b7d9f2dab8a1594c52"}
Apr 16 08:36:15.908147 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:15.908146 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"c670ffc3ddfc307f9cf043daba07d206c61e2b1b1d5e453adf2bccf0eefa9496"}
Apr 16 08:36:17.916950 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:17.916914 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"54c01b59bb08dc4251d61df6d49361a0189b2f6a9badb5e61b11dd40c7a1b601"}
Apr 16 08:36:17.916950 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:17.916956 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"92045fca2b56fc14499fffcc62c4a63aae3b7bd629403ce16363342d13458d16"}
Apr 16 08:36:17.917353 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:17.916970 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"bbec94a0b923d2d54313c2fba6321e137ceb372ba61449ee2c7e8178d5c0c9c5"}
Apr 16 08:36:17.917353 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:17.916982 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"cce5ce63-f276-486b-8ac1-1a65e75b99bd","Type":"ContainerStarted","Data":"adfc01c72d520c816ceaa1d4390f69844886ec56015e00d502130fed64fe0487"}
Apr 16 08:36:17.947351 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:17.947284 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.455238555 podStartE2EDuration="6.947269933s" podCreationTimestamp="2026-04-16 08:36:11 +0000 UTC" firstStartedPulling="2026-04-16 08:36:11.708022849 +0000 UTC m=+200.098877057" lastFinishedPulling="2026-04-16 08:36:17.200054227 +0000 UTC m=+205.590908435" observedRunningTime="2026-04-16 08:36:17.945627109 +0000 UTC m=+206.336481381" watchObservedRunningTime="2026-04-16 08:36:17.947269933 +0000 UTC m=+206.338124158"
Apr 16 08:36:20.081457 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:20.081420 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-857857768-cxrw2"
Apr 16 08:36:20.081457 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:20.081462 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-857857768-cxrw2"
Apr 16 08:36:20.086243 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:20.086222 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-857857768-cxrw2"
Apr 16 08:36:20.932887 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:20.932855 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-857857768-cxrw2"
Apr 16 08:36:21.583149 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:21.583117 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 08:36:30.697207 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.697168 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerName="registry" containerID="cri-o://b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8" gracePeriod=30
Apr 16 08:36:30.926692 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.926672 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:36:30.953840 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.953766 2535 generic.go:358] "Generic (PLEG): container finished" podID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerID="b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8" exitCode=0
Apr 16 08:36:30.953944 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.953835 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82"
Apr 16 08:36:30.953944 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.953838 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" event={"ID":"0c25e594-c9c3-4fc1-a428-ee29b551fad1","Type":"ContainerDied","Data":"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"}
Apr 16 08:36:30.954023 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.953945 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-bb65bb9d6-bsh82" event={"ID":"0c25e594-c9c3-4fc1-a428-ee29b551fad1","Type":"ContainerDied","Data":"9b781651079cf7c724fae49721027e90e3b79541946b9eec0fd9d24938057701"}
Apr 16 08:36:30.954023 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.953965 2535 scope.go:117] "RemoveContainer" containerID="b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"
Apr 16 08:36:30.961951 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.961934 2535 scope.go:117] "RemoveContainer" containerID="b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"
Apr 16 08:36:30.962213 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:30.962195 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8\": container with ID starting with b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8 not found: ID does not exist" containerID="b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"
Apr 16 08:36:30.962280 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.962224 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8"} err="failed to get container status \"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8\": rpc error: code = NotFound desc = could not find container \"b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8\": container with ID starting with b85b0687df08edab7c12d807ab2f7def39cc459b6847c49b2d6379802e4a9cb8 not found: ID does not exist"
Apr 16 08:36:30.978208 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978183 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9h57h\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978300 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978214 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978300 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978234 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978300 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978263 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978466 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978407 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978472 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978558 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978552 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.978656 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978581 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") pod \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\" (UID: \"0c25e594-c9c3-4fc1-a428-ee29b551fad1\") "
Apr 16 08:36:30.979069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.978978 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:30.979069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.979025 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:30.981002 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.980975 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h" (OuterVolumeSpecName: "kube-api-access-9h57h") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "kube-api-access-9h57h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:30.981097 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.981020 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:30.981159 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.981110 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:30.981159 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.981138 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:30.981253 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.981226 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:30.986849 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:30.986827 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "0c25e594-c9c3-4fc1-a428-ee29b551fad1" (UID: "0c25e594-c9c3-4fc1-a428-ee29b551fad1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 08:36:31.079507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079466 2535 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-installation-pull-secrets\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079507 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079506 2535 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/0c25e594-c9c3-4fc1-a428-ee29b551fad1-ca-trust-extracted\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079517 2535 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/0c25e594-c9c3-4fc1-a428-ee29b551fad1-image-registry-private-configuration\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079527 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-trusted-ca\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079537 2535 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-certificates\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079546 2535 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-registry-tls\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079555 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9h57h\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-kube-api-access-9h57h\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.079661 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.079563 2535 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0c25e594-c9c3-4fc1-a428-ee29b551fad1-bound-sa-token\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:31.276187 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.276152 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"]
Apr 16 08:36:31.280428 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:31.280404 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-bb65bb9d6-bsh82"]
Apr 16 08:36:32.138981 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.138924 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-75cbddd885-f99nj" podUID="701168b5-5d8e-4798-849a-64ab0e79bf80" containerName="console" containerID="cri-o://ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637" gracePeriod=15
Apr 16 08:36:32.258476 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.258443 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" path="/var/lib/kubelet/pods/0c25e594-c9c3-4fc1-a428-ee29b551fad1/volumes"
Apr 16 08:36:32.378553 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.378532 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75cbddd885-f99nj_701168b5-5d8e-4798-849a-64ab0e79bf80/console/0.log"
Apr 16 08:36:32.378668 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.378587 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75cbddd885-f99nj"
Apr 16 08:36:32.491895 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491812 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgw5r\" (UniqueName: \"kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.491895 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491867 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.491895 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491889 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.492103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491923 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.492103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491943 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.492103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.491968 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config\") pod \"701168b5-5d8e-4798-849a-64ab0e79bf80\" (UID: \"701168b5-5d8e-4798-849a-64ab0e79bf80\") "
Apr 16 08:36:32.492435 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.492401 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config" (OuterVolumeSpecName: "console-config") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:32.492435 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.492416 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca" (OuterVolumeSpecName: "service-ca") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:32.492435 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.492427 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 08:36:32.494144 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.494122 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:32.494240 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.494165 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 08:36:32.494240 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.494205 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r" (OuterVolumeSpecName: "kube-api-access-qgw5r") pod "701168b5-5d8e-4798-849a-64ab0e79bf80" (UID: "701168b5-5d8e-4798-849a-64ab0e79bf80"). InnerVolumeSpecName "kube-api-access-qgw5r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 08:36:32.592984 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.592945 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-oauth-serving-cert\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.592984 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.592978 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-oauth-config\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.592984 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.592988 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-service-ca\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.592984 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.592998 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/701168b5-5d8e-4798-849a-64ab0e79bf80-console-serving-cert\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.593234 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.593007 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/701168b5-5d8e-4798-849a-64ab0e79bf80-console-config\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.593234 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.593016 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgw5r\" (UniqueName: \"kubernetes.io/projected/701168b5-5d8e-4798-849a-64ab0e79bf80-kube-api-access-qgw5r\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\""
Apr 16 08:36:32.961334 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961306 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-75cbddd885-f99nj_701168b5-5d8e-4798-849a-64ab0e79bf80/console/0.log"
Apr 16 08:36:32.961508 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961343 2535 generic.go:358] "Generic (PLEG): container finished" podID="701168b5-5d8e-4798-849a-64ab0e79bf80" containerID="ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637" exitCode=2
Apr 16 08:36:32.961508 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961407 2535 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-75cbddd885-f99nj"
Apr 16 08:36:32.961508 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961426 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cbddd885-f99nj" event={"ID":"701168b5-5d8e-4798-849a-64ab0e79bf80","Type":"ContainerDied","Data":"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"}
Apr 16 08:36:32.961508 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961451 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-75cbddd885-f99nj" event={"ID":"701168b5-5d8e-4798-849a-64ab0e79bf80","Type":"ContainerDied","Data":"94ece12427c730469fce497f9505f0a237cdc9bac905b35f63b4b15edb8ed5c2"}
Apr 16 08:36:32.961508 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.961465 2535 scope.go:117] "RemoveContainer" containerID="ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"
Apr 16 08:36:32.970129 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.970086 2535 scope.go:117] "RemoveContainer" containerID="ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"
Apr 16 08:36:32.970362 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:36:32.970343 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637\": container with ID starting with ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637 not found: ID does not exist" containerID="ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"
Apr 16 08:36:32.970424 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.970372 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637"} err="failed to get container status \"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637\": rpc error: code = NotFound desc = could not find container \"ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637\": container with ID starting with ab02feec9ffb7d3220ff9c42fb60d179ff88ab1a88a09c0ffd81520e26cd0637 not found: ID does not exist"
Apr 16 08:36:32.983938 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.983915 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-75cbddd885-f99nj"]
Apr 16 08:36:32.988859 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:32.988834 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-75cbddd885-f99nj"]
Apr 16 08:36:34.257372 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:36:34.257328 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701168b5-5d8e-4798-849a-64ab0e79bf80" path="/var/lib/kubelet/pods/701168b5-5d8e-4798-849a-64ab0e79bf80/volumes"
Apr 16 08:37:03.048588 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:03.048555 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl"
Apr 16 08:37:03.050775 ip-10-0-139-144
kubenswrapper[2535]: I0416 08:37:03.050756 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7146c66d-f9a6-4b2c-8f79-e72ee1b00021-metrics-certs\") pod \"network-metrics-daemon-88lvl\" (UID: \"7146c66d-f9a6-4b2c-8f79-e72ee1b00021\") " pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:37:03.056959 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:03.056938 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kd4nk\"" Apr 16 08:37:03.065781 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:03.065767 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-88lvl" Apr 16 08:37:03.181582 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:03.181549 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-88lvl"] Apr 16 08:37:03.184242 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:37:03.184213 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7146c66d_f9a6_4b2c_8f79_e72ee1b00021.slice/crio-e1b8f71fba89b37f6eca6d777b0c770cb2dcd428f204d24b45d45cfbbc5f6223 WatchSource:0}: Error finding container e1b8f71fba89b37f6eca6d777b0c770cb2dcd428f204d24b45d45cfbbc5f6223: Status 404 returned error can't find the container with id e1b8f71fba89b37f6eca6d777b0c770cb2dcd428f204d24b45d45cfbbc5f6223 Apr 16 08:37:04.050781 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:04.050738 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88lvl" event={"ID":"7146c66d-f9a6-4b2c-8f79-e72ee1b00021","Type":"ContainerStarted","Data":"e1b8f71fba89b37f6eca6d777b0c770cb2dcd428f204d24b45d45cfbbc5f6223"} Apr 16 08:37:05.054880 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:05.054847 2535 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88lvl" event={"ID":"7146c66d-f9a6-4b2c-8f79-e72ee1b00021","Type":"ContainerStarted","Data":"8586f383a35bf3053e96a258f21c13634194b05c6a79811e158cdb964580251a"} Apr 16 08:37:05.054880 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:05.054883 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-88lvl" event={"ID":"7146c66d-f9a6-4b2c-8f79-e72ee1b00021","Type":"ContainerStarted","Data":"4ee21f6f1692e6d5e192f7b3b5bd5643ba8ed7ee1baefc57c16e7af54d424686"} Apr 16 08:37:05.072112 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:05.072057 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-88lvl" podStartSLOduration=252.103341163 podStartE2EDuration="4m13.07203992s" podCreationTimestamp="2026-04-16 08:32:52 +0000 UTC" firstStartedPulling="2026-04-16 08:37:03.186087105 +0000 UTC m=+251.576941313" lastFinishedPulling="2026-04-16 08:37:04.154785858 +0000 UTC m=+252.545640070" observedRunningTime="2026-04-16 08:37:05.071764156 +0000 UTC m=+253.462618399" watchObservedRunningTime="2026-04-16 08:37:05.07203992 +0000 UTC m=+253.462894151" Apr 16 08:37:11.583060 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:11.582960 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:37:11.602898 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:11.602873 2535 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:37:12.089154 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:12.089123 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 08:37:30.743557 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:37:30.743512 2535 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" podUID="73644da9-c64b-4803-b5a4-ff849e2de647" Apr 16 08:37:31.125139 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:31.125062 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:37:34.599227 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:34.599193 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:37:34.601513 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:34.601481 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/73644da9-c64b-4803-b5a4-ff849e2de647-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-frz7j\" (UID: \"73644da9-c64b-4803-b5a4-ff849e2de647\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:37:34.728757 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:34.728724 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-l29ml\"" Apr 16 08:37:34.736842 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:34.736818 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" Apr 16 08:37:34.854412 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:34.854317 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j"] Apr 16 08:37:34.857466 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:37:34.857438 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73644da9_c64b_4803_b5a4_ff849e2de647.slice/crio-c57ea9935ddbac4185d202850f3f8ee3074d56ce964decbb684e850f6f4c172e WatchSource:0}: Error finding container c57ea9935ddbac4185d202850f3f8ee3074d56ce964decbb684e850f6f4c172e: Status 404 returned error can't find the container with id c57ea9935ddbac4185d202850f3f8ee3074d56ce964decbb684e850f6f4c172e Apr 16 08:37:35.136405 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:35.136374 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" event={"ID":"73644da9-c64b-4803-b5a4-ff849e2de647","Type":"ContainerStarted","Data":"c57ea9935ddbac4185d202850f3f8ee3074d56ce964decbb684e850f6f4c172e"} Apr 16 08:37:37.143009 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:37.142977 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" event={"ID":"73644da9-c64b-4803-b5a4-ff849e2de647","Type":"ContainerStarted","Data":"36193044db8cbb7ca447a354c9ab7e02d9f94d5c7f2d7bad52a6a9f00d04a4b8"} Apr 16 08:37:37.163282 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:37.163236 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-frz7j" podStartSLOduration=270.936846518 podStartE2EDuration="4m32.163223818s" podCreationTimestamp="2026-04-16 08:33:05 +0000 UTC" firstStartedPulling="2026-04-16 
08:37:34.85975961 +0000 UTC m=+283.250613819" lastFinishedPulling="2026-04-16 08:37:36.0861369 +0000 UTC m=+284.476991119" observedRunningTime="2026-04-16 08:37:37.161473131 +0000 UTC m=+285.552327353" watchObservedRunningTime="2026-04-16 08:37:37.163223818 +0000 UTC m=+285.554078048" Apr 16 08:37:52.128375 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:52.128344 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:37:52.129057 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:52.129037 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:37:52.135315 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:52.135297 2535 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 08:37:52.488449 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:37:52.488406 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:38:17.507108 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.507053 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-857857768-cxrw2" podUID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" containerName="console" containerID="cri-o://6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7" gracePeriod=15 Apr 16 08:38:17.738103 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.738080 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857857768-cxrw2_b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb/console/0.log" Apr 16 08:38:17.738228 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.738148 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857857768-cxrw2" Apr 16 08:38:17.810034 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.809962 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw7jw\" (UniqueName: \"kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810034 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.809994 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810034 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810020 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810034 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810036 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810354 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810054 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810354 ip-10-0-139-144 
kubenswrapper[2535]: I0416 08:38:17.810077 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810354 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810108 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config\") pod \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\" (UID: \"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb\") " Apr 16 08:38:17.810535 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810415 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca" (OuterVolumeSpecName: "service-ca") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:17.810535 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810438 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:17.810535 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810451 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config" (OuterVolumeSpecName: "console-config") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:17.810688 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.810586 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 08:38:17.812086 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.812062 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw" (OuterVolumeSpecName: "kube-api-access-zw7jw") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "kube-api-access-zw7jw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 08:38:17.812198 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.812179 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:38:17.812256 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.812199 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" (UID: "b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 08:38:17.911101 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911077 2535 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-oauth-config\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911101 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911101 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zw7jw\" (UniqueName: \"kubernetes.io/projected/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-kube-api-access-zw7jw\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911112 2535 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-service-ca\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911120 2535 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-serving-cert\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911130 2535 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-console-config\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911138 2535 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-oauth-serving-cert\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:17.911251 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:17.911145 2535 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb-trusted-ca-bundle\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 08:38:18.252451 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252426 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-857857768-cxrw2_b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb/console/0.log" Apr 16 08:38:18.252666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252464 2535 generic.go:358] "Generic (PLEG): container finished" podID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" containerID="6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7" exitCode=2 Apr 16 08:38:18.252666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252518 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857768-cxrw2" event={"ID":"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb","Type":"ContainerDied","Data":"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7"} Apr 16 08:38:18.252666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252548 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-857857768-cxrw2" Apr 16 08:38:18.252666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252565 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-857857768-cxrw2" event={"ID":"b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb","Type":"ContainerDied","Data":"cc2a44c1dcb272289cf108b85ec804d0e9517b11a8cdaf195a79050048349964"} Apr 16 08:38:18.252666 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.252588 2535 scope.go:117] "RemoveContainer" containerID="6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7" Apr 16 08:38:18.261568 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.261549 2535 scope.go:117] "RemoveContainer" containerID="6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7" Apr 16 08:38:18.261812 ip-10-0-139-144 kubenswrapper[2535]: E0416 08:38:18.261794 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7\": container with ID starting with 6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7 not found: ID does not exist" containerID="6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7" Apr 16 08:38:18.261881 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.261819 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7"} err="failed to get container status \"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7\": rpc error: code = NotFound desc = could not find container \"6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7\": container with ID starting with 6f3d76b2935dd715cf1eb37249972a525814fa92808e22a2581fa5ebfeb6c4c7 not found: ID does not exist" Apr 16 08:38:18.278230 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.278207 2535 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:38:18.288210 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:18.288189 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-857857768-cxrw2"] Apr 16 08:38:20.257681 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:20.257651 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" path="/var/lib/kubelet/pods/b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb/volumes" Apr 16 08:38:46.696021 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.695934 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm"] Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696318 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696335 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696364 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerName="registry" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696372 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerName="registry" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696390 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="701168b5-5d8e-4798-849a-64ab0e79bf80" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696398 2535 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="701168b5-5d8e-4798-849a-64ab0e79bf80" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696461 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="701168b5-5d8e-4798-849a-64ab0e79bf80" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696472 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="b30018fa-2aab-4a5f-a2e3-d2c3b7f212eb" containerName="console" Apr 16 08:38:46.696566 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.696482 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c25e594-c9c3-4fc1-a428-ee29b551fad1" containerName="registry" Apr 16 08:38:46.700548 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.700528 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.703183 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.703089 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 08:38:46.703183 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.703123 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 08:38:46.703334 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.703123 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 08:38:46.704180 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.704161 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 08:38:46.704258 ip-10-0-139-144 kubenswrapper[2535]: 
I0416 08:38:46.704202 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 08:38:46.704258 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.704226 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 08:38:46.704258 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.704241 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 08:38:46.708614 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.708585 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm"] Apr 16 08:38:46.814576 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814542 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.814576 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814579 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q56rw\" (UniqueName: \"kubernetes.io/projected/312aa75d-6df9-4414-b0fe-9c815220b044-kube-api-access-q56rw\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.814773 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814610 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/312aa75d-6df9-4414-b0fe-9c815220b044-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.814773 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814634 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.814773 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814653 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.814773 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.814675 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.915921 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.915885 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/312aa75d-6df9-4414-b0fe-9c815220b044-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.915931 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.915966 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.916001 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.916049 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916247 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.916077 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q56rw\" (UniqueName: \"kubernetes.io/projected/312aa75d-6df9-4414-b0fe-9c815220b044-kube-api-access-q56rw\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.916679 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.916652 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/312aa75d-6df9-4414-b0fe-9c815220b044-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.918527 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.918483 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-ca\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.918606 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.918550 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.918836 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.918820 2535 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.918954 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.918938 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/312aa75d-6df9-4414-b0fe-9c815220b044-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:46.925253 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:46.925225 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q56rw\" (UniqueName: \"kubernetes.io/projected/312aa75d-6df9-4414-b0fe-9c815220b044-kube-api-access-q56rw\") pod \"cluster-proxy-proxy-agent-774db776c9-hzqxm\" (UID: \"312aa75d-6df9-4414-b0fe-9c815220b044\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:47.016543 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:47.016459 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" Apr 16 08:38:47.144300 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:47.144268 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm"] Apr 16 08:38:47.147001 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:38:47.146971 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod312aa75d_6df9_4414_b0fe_9c815220b044.slice/crio-7e20dc4a01f65f250cd47231d7e25be75f2d7c2d30cc89d90891e80810f2b293 WatchSource:0}: Error finding container 7e20dc4a01f65f250cd47231d7e25be75f2d7c2d30cc89d90891e80810f2b293: Status 404 returned error can't find the container with id 7e20dc4a01f65f250cd47231d7e25be75f2d7c2d30cc89d90891e80810f2b293 Apr 16 08:38:47.149806 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:47.149788 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 08:38:47.328598 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:47.328480 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" event={"ID":"312aa75d-6df9-4414-b0fe-9c815220b044","Type":"ContainerStarted","Data":"7e20dc4a01f65f250cd47231d7e25be75f2d7c2d30cc89d90891e80810f2b293"} Apr 16 08:38:50.338855 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:50.338819 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" event={"ID":"312aa75d-6df9-4414-b0fe-9c815220b044","Type":"ContainerStarted","Data":"433e1c11294f332d871f25d622df96b3e79c900be8d22504ab63eedf14245199"} Apr 16 08:38:52.347452 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:52.347415 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" event={"ID":"312aa75d-6df9-4414-b0fe-9c815220b044","Type":"ContainerStarted","Data":"bffe24a7a1d3de6b9baa1e10ee931436d06565b5f954da086320570cc97f5210"} Apr 16 08:38:52.347452 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:52.347450 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" event={"ID":"312aa75d-6df9-4414-b0fe-9c815220b044","Type":"ContainerStarted","Data":"cc30b9510beaa6751827d4717b26ba3e6dda0b6c5f0c12407ad91f96405e5ba1"} Apr 16 08:38:52.368826 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:38:52.368782 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-774db776c9-hzqxm" podStartSLOduration=1.7196677820000001 podStartE2EDuration="6.368767397s" podCreationTimestamp="2026-04-16 08:38:46 +0000 UTC" firstStartedPulling="2026-04-16 08:38:47.149929115 +0000 UTC m=+355.540783324" lastFinishedPulling="2026-04-16 08:38:51.799028515 +0000 UTC m=+360.189882939" observedRunningTime="2026-04-16 08:38:52.367764709 +0000 UTC m=+360.758618951" watchObservedRunningTime="2026-04-16 08:38:52.368767397 +0000 UTC m=+360.759621628" Apr 16 08:40:45.071405 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.071376 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl"] Apr 16 08:40:45.074281 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.074265 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.076803 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.076779 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 16 08:40:45.077705 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.077684 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 16 08:40:45.077818 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.077725 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 16 08:40:45.077818 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.077733 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-4wfrn\"" Apr 16 08:40:45.077818 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.077804 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 16 08:40:45.086543 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.086517 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl"] Apr 16 08:40:45.214447 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.214415 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/c2ec852c-497f-41eb-93b9-8f68218e65a5-kube-api-access-qfpff\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.214611 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.214467 2535 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ec852c-497f-41eb-93b9-8f68218e65a5-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.214611 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.214546 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c2ec852c-497f-41eb-93b9-8f68218e65a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.315455 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.315417 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/c2ec852c-497f-41eb-93b9-8f68218e65a5-kube-api-access-qfpff\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.315652 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.315510 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ec852c-497f-41eb-93b9-8f68218e65a5-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.315652 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.315545 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: 
\"kubernetes.io/configmap/c2ec852c-497f-41eb-93b9-8f68218e65a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.316248 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.316229 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/c2ec852c-497f-41eb-93b9-8f68218e65a5-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.317924 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.317903 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c2ec852c-497f-41eb-93b9-8f68218e65a5-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.325069 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.325012 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfpff\" (UniqueName: \"kubernetes.io/projected/c2ec852c-497f-41eb-93b9-8f68218e65a5-kube-api-access-qfpff\") pod \"kubeflow-trainer-controller-manager-55f5694779-5dgzl\" (UID: \"c2ec852c-497f-41eb-93b9-8f68218e65a5\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.383616 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.383582 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:45.499773 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.499744 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl"] Apr 16 08:40:45.503086 ip-10-0-139-144 kubenswrapper[2535]: W0416 08:40:45.503062 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc2ec852c_497f_41eb_93b9_8f68218e65a5.slice/crio-591b0f8ecc7701fde060ed2e301f07db3c7820cc81598e1b13f558aeaede37f1 WatchSource:0}: Error finding container 591b0f8ecc7701fde060ed2e301f07db3c7820cc81598e1b13f558aeaede37f1: Status 404 returned error can't find the container with id 591b0f8ecc7701fde060ed2e301f07db3c7820cc81598e1b13f558aeaede37f1 Apr 16 08:40:45.652786 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:45.652738 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" event={"ID":"c2ec852c-497f-41eb-93b9-8f68218e65a5","Type":"ContainerStarted","Data":"591b0f8ecc7701fde060ed2e301f07db3c7820cc81598e1b13f558aeaede37f1"} Apr 16 08:40:48.663978 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:48.663937 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" event={"ID":"c2ec852c-497f-41eb-93b9-8f68218e65a5","Type":"ContainerStarted","Data":"e0cbf80b4140cb02f559f8968216164973339c6b58222c0dfab7b7105529b57f"} Apr 16 08:40:48.664353 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:48.664009 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:40:48.682462 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:40:48.682414 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" podStartSLOduration=1.452344837 podStartE2EDuration="3.682392464s" podCreationTimestamp="2026-04-16 08:40:45 +0000 UTC" firstStartedPulling="2026-04-16 08:40:45.504903162 +0000 UTC m=+473.895757375" lastFinishedPulling="2026-04-16 08:40:47.734950793 +0000 UTC m=+476.125805002" observedRunningTime="2026-04-16 08:40:48.681602036 +0000 UTC m=+477.072456297" watchObservedRunningTime="2026-04-16 08:40:48.682392464 +0000 UTC m=+477.073246694" Apr 16 08:41:04.672280 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:41:04.672251 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-5dgzl" Apr 16 08:42:52.148415 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:42:52.148386 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:42:52.149748 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:42:52.149724 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:47:52.167600 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:47:52.167571 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:47:52.171827 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:47:52.171803 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:52:52.189358 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:52:52.189328 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 
08:52:52.192066 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:52:52.191097 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:57:52.208198 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:57:52.208078 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 08:57:52.212339 ip-10-0-139-144 kubenswrapper[2535]: I0416 08:57:52.210624 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:02:52.229914 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:02:52.229806 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:02:52.232192 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:02:52.232167 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:07:52.249111 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:07:52.249002 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:07:52.253069 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:07:52.251566 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:12:40.230632 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.230598 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vqrq2/must-gather-mr7l5"] Apr 16 09:12:40.234037 ip-10-0-139-144 kubenswrapper[2535]: I0416 
09:12:40.234015 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.236433 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.236409 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqrq2\"/\"kube-root-ca.crt\"" Apr 16 09:12:40.237290 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.237270 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-vqrq2\"/\"openshift-service-ca.crt\"" Apr 16 09:12:40.237383 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.237273 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-vqrq2\"/\"default-dockercfg-q5rxv\"" Apr 16 09:12:40.249125 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.249106 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqrq2/must-gather-mr7l5"] Apr 16 09:12:40.316677 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.316649 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xbcm\" (UniqueName: \"kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.316824 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.316745 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.417860 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.417828 2535 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.417992 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.417877 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xbcm\" (UniqueName: \"kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.418160 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.418141 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.426392 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.426365 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xbcm\" (UniqueName: \"kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm\") pod \"must-gather-mr7l5\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.543511 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.543417 2535 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:12:40.656976 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.656899 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vqrq2/must-gather-mr7l5"] Apr 16 09:12:40.659198 ip-10-0-139-144 kubenswrapper[2535]: W0416 09:12:40.659163 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36bf4c82_b360_4d13_a1e2_24317c42ec06.slice/crio-d391290307610755ae4216473bd8fc04885864ada2e19329e5dd824e341f3151 WatchSource:0}: Error finding container d391290307610755ae4216473bd8fc04885864ada2e19329e5dd824e341f3151: Status 404 returned error can't find the container with id d391290307610755ae4216473bd8fc04885864ada2e19329e5dd824e341f3151 Apr 16 09:12:40.660829 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.660811 2535 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 09:12:40.880633 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:40.880555 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" event={"ID":"36bf4c82-b360-4d13-a1e2-24317c42ec06","Type":"ContainerStarted","Data":"d391290307610755ae4216473bd8fc04885864ada2e19329e5dd824e341f3151"} Apr 16 09:12:45.898807 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:45.898760 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" event={"ID":"36bf4c82-b360-4d13-a1e2-24317c42ec06","Type":"ContainerStarted","Data":"8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c"} Apr 16 09:12:45.898807 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:45.898798 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" 
event={"ID":"36bf4c82-b360-4d13-a1e2-24317c42ec06","Type":"ContainerStarted","Data":"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1"} Apr 16 09:12:45.916867 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:45.916822 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" podStartSLOduration=1.209324869 podStartE2EDuration="5.916806125s" podCreationTimestamp="2026-04-16 09:12:40 +0000 UTC" firstStartedPulling="2026-04-16 09:12:40.660989252 +0000 UTC m=+2389.051843467" lastFinishedPulling="2026-04-16 09:12:45.368470499 +0000 UTC m=+2393.759324723" observedRunningTime="2026-04-16 09:12:45.915919885 +0000 UTC m=+2394.306774117" watchObservedRunningTime="2026-04-16 09:12:45.916806125 +0000 UTC m=+2394.307660355" Apr 16 09:12:52.272739 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:52.272621 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:12:52.280278 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:52.274481 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:12:54.913043 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:54.912999 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5dgzl_c2ec852c-497f-41eb-93b9-8f68218e65a5/manager/0.log" Apr 16 09:12:55.377402 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:55.377333 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5dgzl_c2ec852c-497f-41eb-93b9-8f68218e65a5/manager/0.log" Apr 16 09:12:55.820809 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:12:55.820780 2535 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-5dgzl_c2ec852c-497f-41eb-93b9-8f68218e65a5/manager/0.log" Apr 16 09:13:31.038563 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:31.038527 2535 generic.go:358] "Generic (PLEG): container finished" podID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerID="f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1" exitCode=0 Apr 16 09:13:31.038563 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:31.038565 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" event={"ID":"36bf4c82-b360-4d13-a1e2-24317c42ec06","Type":"ContainerDied","Data":"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1"} Apr 16 09:13:31.039041 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:31.038880 2535 scope.go:117] "RemoveContainer" containerID="f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1" Apr 16 09:13:31.327404 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:31.327333 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vqrq2_must-gather-mr7l5_36bf4c82-b360-4d13-a1e2-24317c42ec06/gather/0.log" Apr 16 09:13:34.889114 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:34.889083 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-bxg8l_ef850f58-6397-465e-9a95-6088bf0af066/global-pull-secret-syncer/0.log" Apr 16 09:13:35.064227 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:35.064196 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-v8qrs_cfaaf14a-4a7f-4c15-911d-d1a5a2f0c40f/konnectivity-agent/0.log" Apr 16 09:13:35.137844 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:35.137819 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-144.ec2.internal_118bda75cde6d644622189268cb66453/haproxy/0.log" Apr 16 09:13:36.658760 ip-10-0-139-144 
kubenswrapper[2535]: I0416 09:13:36.658726 2535 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vqrq2/must-gather-mr7l5"] Apr 16 09:13:36.659216 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.658961 2535 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="copy" containerID="cri-o://8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c" gracePeriod=2 Apr 16 09:13:36.663583 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.663557 2535 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vqrq2/must-gather-mr7l5"] Apr 16 09:13:36.884805 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.884786 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vqrq2_must-gather-mr7l5_36bf4c82-b360-4d13-a1e2-24317c42ec06/copy/0.log" Apr 16 09:13:36.885159 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.885145 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:13:36.887252 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.887229 2535 status_manager.go:895] "Failed to get status for pod" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" err="pods \"must-gather-mr7l5\" is forbidden: User \"system:node:ip-10-0-139-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vqrq2\": no relationship found between node 'ip-10-0-139-144.ec2.internal' and this object" Apr 16 09:13:36.985281 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.985228 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xbcm\" (UniqueName: \"kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm\") pod \"36bf4c82-b360-4d13-a1e2-24317c42ec06\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " Apr 16 09:13:36.985281 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.985278 2535 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output\") pod \"36bf4c82-b360-4d13-a1e2-24317c42ec06\" (UID: \"36bf4c82-b360-4d13-a1e2-24317c42ec06\") " Apr 16 09:13:36.987354 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.987326 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm" (OuterVolumeSpecName: "kube-api-access-7xbcm") pod "36bf4c82-b360-4d13-a1e2-24317c42ec06" (UID: "36bf4c82-b360-4d13-a1e2-24317c42ec06"). InnerVolumeSpecName "kube-api-access-7xbcm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 09:13:36.987456 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:36.987414 2535 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "36bf4c82-b360-4d13-a1e2-24317c42ec06" (UID: "36bf4c82-b360-4d13-a1e2-24317c42ec06"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 09:13:37.055534 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.055515 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vqrq2_must-gather-mr7l5_36bf4c82-b360-4d13-a1e2-24317c42ec06/copy/0.log" Apr 16 09:13:37.055797 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.055780 2535 generic.go:358] "Generic (PLEG): container finished" podID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerID="8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c" exitCode=143 Apr 16 09:13:37.055849 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.055824 2535 scope.go:117] "RemoveContainer" containerID="8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c" Apr 16 09:13:37.055849 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.055825 2535 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" Apr 16 09:13:37.059508 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.059469 2535 status_manager.go:895] "Failed to get status for pod" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" err="pods \"must-gather-mr7l5\" is forbidden: User \"system:node:ip-10-0-139-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vqrq2\": no relationship found between node 'ip-10-0-139-144.ec2.internal' and this object" Apr 16 09:13:37.063151 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.063047 2535 scope.go:117] "RemoveContainer" containerID="f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1" Apr 16 09:13:37.065196 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.065170 2535 status_manager.go:895] "Failed to get status for pod" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" pod="openshift-must-gather-vqrq2/must-gather-mr7l5" err="pods \"must-gather-mr7l5\" is forbidden: User \"system:node:ip-10-0-139-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-vqrq2\": no relationship found between node 'ip-10-0-139-144.ec2.internal' and this object" Apr 16 09:13:37.076120 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.076096 2535 scope.go:117] "RemoveContainer" containerID="8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c" Apr 16 09:13:37.076534 ip-10-0-139-144 kubenswrapper[2535]: E0416 09:13:37.076481 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c\": container with ID starting with 8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c not found: ID does not exist" containerID="8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c" Apr 16 
09:13:37.076625 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.076543 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c"} err="failed to get container status \"8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c\": rpc error: code = NotFound desc = could not find container \"8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c\": container with ID starting with 8d8c1330f78b9375e3e0beb90ade91149c7a18ec253f456091f7f153d4444f7c not found: ID does not exist" Apr 16 09:13:37.076625 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.076568 2535 scope.go:117] "RemoveContainer" containerID="f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1" Apr 16 09:13:37.076830 ip-10-0-139-144 kubenswrapper[2535]: E0416 09:13:37.076812 2535 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1\": container with ID starting with f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1 not found: ID does not exist" containerID="f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1" Apr 16 09:13:37.076896 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.076834 2535 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1"} err="failed to get container status \"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1\": rpc error: code = NotFound desc = could not find container \"f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1\": container with ID starting with f2234b75eb9fe0a0f796a276704e057e41a2bd843816086c761ed5519ad7e8d1 not found: ID does not exist" Apr 16 09:13:37.086332 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.086311 2535 
reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/36bf4c82-b360-4d13-a1e2-24317c42ec06-must-gather-output\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 09:13:37.086414 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:37.086336 2535 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xbcm\" (UniqueName: \"kubernetes.io/projected/36bf4c82-b360-4d13-a1e2-24317c42ec06-kube-api-access-7xbcm\") on node \"ip-10-0-139-144.ec2.internal\" DevicePath \"\"" Apr 16 09:13:38.257205 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:38.257162 2535 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" path="/var/lib/kubelet/pods/36bf4c82-b360-4d13-a1e2-24317c42ec06/volumes" Apr 16 09:13:38.964988 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:38.964963 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-68mtk_39bc34c3-d7f5-4497-a3ca-f2a44cf292f8/kube-state-metrics/0.log" Apr 16 09:13:38.990992 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:38.990974 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-68mtk_39bc34c3-d7f5-4497-a3ca-f2a44cf292f8/kube-rbac-proxy-main/0.log" Apr 16 09:13:39.017130 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.017110 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-68mtk_39bc34c3-d7f5-4497-a3ca-f2a44cf292f8/kube-rbac-proxy-self/0.log" Apr 16 09:13:39.288326 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.288309 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zzjzh_9587686e-6131-4295-8530-2461b71a63c0/node-exporter/0.log" Apr 16 09:13:39.314710 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.314688 2535 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-monitoring_node-exporter-zzjzh_9587686e-6131-4295-8530-2461b71a63c0/kube-rbac-proxy/0.log" Apr 16 09:13:39.336898 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.336881 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-zzjzh_9587686e-6131-4295-8530-2461b71a63c0/init-textfile/0.log" Apr 16 09:13:39.366661 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.366635 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kzhf2_1ce9b851-bc26-4187-b2c8-4da5028f14b8/kube-rbac-proxy-main/0.log" Apr 16 09:13:39.391347 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.391327 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kzhf2_1ce9b851-bc26-4187-b2c8-4da5028f14b8/kube-rbac-proxy-self/0.log" Apr 16 09:13:39.418140 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.418117 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-kzhf2_1ce9b851-bc26-4187-b2c8-4da5028f14b8/openshift-state-metrics/0.log" Apr 16 09:13:39.473216 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.473189 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/prometheus/0.log" Apr 16 09:13:39.506106 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.506039 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/config-reloader/0.log" Apr 16 09:13:39.542056 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.542036 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/thanos-sidecar/0.log" Apr 16 09:13:39.585223 ip-10-0-139-144 kubenswrapper[2535]: I0416 
09:13:39.585202 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/kube-rbac-proxy-web/0.log" Apr 16 09:13:39.614456 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.614436 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/kube-rbac-proxy/0.log" Apr 16 09:13:39.665426 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.665407 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/kube-rbac-proxy-thanos/0.log" Apr 16 09:13:39.705829 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:39.705812 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_cce5ce63-f276-486b-8ac1-1a65e75b99bd/init-config-reloader/0.log" Apr 16 09:13:41.047307 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:41.047285 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-frz7j_73644da9-c64b-4803-b5a4-ff849e2de647/networking-console-plugin/0.log" Apr 16 09:13:42.359148 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359118 2535 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r"] Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359383 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="copy" Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359392 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="copy" Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359420 2535 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="gather" Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359426 2535 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="gather" Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359468 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="gather" Apr 16 09:13:42.359517 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.359478 2535 memory_manager.go:356] "RemoveStaleState removing state" podUID="36bf4c82-b360-4d13-a1e2-24317c42ec06" containerName="copy" Apr 16 09:13:42.364388 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.364368 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.366394 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.366376 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"kube-root-ca.crt\"" Apr 16 09:13:42.367407 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.367283 2535 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6sqkd\"/\"openshift-service-ca.crt\"" Apr 16 09:13:42.367523 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.367508 2535 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6sqkd\"/\"default-dockercfg-kg5b9\"" Apr 16 09:13:42.372176 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.372158 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r"] Apr 16 09:13:42.423782 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.423760 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-lib-modules\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.423888 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.423802 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-podres\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.423888 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.423830 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k85j\" (UniqueName: \"kubernetes.io/projected/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-kube-api-access-5k85j\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.423888 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.423878 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-proc\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.424014 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.423922 2535 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-sys\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " 
pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.524962 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.524940 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-sys\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525089 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.524976 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-lib-modules\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525089 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525051 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-sys\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525168 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525096 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-podres\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525168 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525105 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-lib-modules\") pod 
\"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525168 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525127 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5k85j\" (UniqueName: \"kubernetes.io/projected/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-kube-api-access-5k85j\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525258 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525174 2535 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-proc\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525258 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525197 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-podres\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.525317 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.525307 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-proc\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.533446 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.533425 2535 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-5k85j\" (UniqueName: \"kubernetes.io/projected/a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2-kube-api-access-5k85j\") pod \"perf-node-gather-daemonset-rwc5r\" (UID: \"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2\") " pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.674625 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.674605 2535 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:42.790352 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:42.790231 2535 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r"] Apr 16 09:13:42.793078 ip-10-0-139-144 kubenswrapper[2535]: W0416 09:13:42.793048 2535 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda4f9f1f1_71a8_4eb2_9e7a_582d2d02d0d2.slice/crio-c8b092c2e78ba0dc28145e3b47012badfb4cce915efba9cdca8c6a4addb5fded WatchSource:0}: Error finding container c8b092c2e78ba0dc28145e3b47012badfb4cce915efba9cdca8c6a4addb5fded: Status 404 returned error can't find the container with id c8b092c2e78ba0dc28145e3b47012badfb4cce915efba9cdca8c6a4addb5fded Apr 16 09:13:43.073958 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.073876 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" event={"ID":"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2","Type":"ContainerStarted","Data":"23af4357923ad9d5c07b89a9087b035d429843ef92aae846695c200abf367f0b"} Apr 16 09:13:43.073958 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.073911 2535 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" event={"ID":"a4f9f1f1-71a8-4eb2-9e7a-582d2d02d0d2","Type":"ContainerStarted","Data":"c8b092c2e78ba0dc28145e3b47012badfb4cce915efba9cdca8c6a4addb5fded"} Apr 16 09:13:43.074166 ip-10-0-139-144 
kubenswrapper[2535]: I0416 09:13:43.074003 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:43.090630 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.090594 2535 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" podStartSLOduration=1.090582825 podStartE2EDuration="1.090582825s" podCreationTimestamp="2026-04-16 09:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 09:13:43.089707954 +0000 UTC m=+2451.480562186" watchObservedRunningTime="2026-04-16 09:13:43.090582825 +0000 UTC m=+2451.481437056" Apr 16 09:13:43.127749 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.127729 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tzcjj_37aa253a-35df-4129-a89c-8aff6799646f/dns/0.log" Apr 16 09:13:43.151651 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.151631 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tzcjj_37aa253a-35df-4129-a89c-8aff6799646f/kube-rbac-proxy/0.log" Apr 16 09:13:43.246187 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.246170 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-clvxz_72c8255d-1385-43a9-b20b-b7dfd0472d12/dns-node-resolver/0.log" Apr 16 09:13:43.752552 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:43.752523 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-c4ztq_7efe070a-8151-4485-9a0a-23daecd1d21c/node-ca/0.log" Apr 16 09:13:44.889600 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:44.889573 2535 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress-canary_ingress-canary-hgxtv_35c727f2-1ff4-4364-a35e-46305564bbb8/serve-healthcheck-canary/0.log" Apr 16 09:13:45.308662 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:45.308591 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j9jj_8fb5ded9-9208-45a5-92ea-b444e485f55b/kube-rbac-proxy/0.log" Apr 16 09:13:45.331017 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:45.330994 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j9jj_8fb5ded9-9208-45a5-92ea-b444e485f55b/exporter/0.log" Apr 16 09:13:45.352442 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:45.352416 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2j9jj_8fb5ded9-9208-45a5-92ea-b444e485f55b/extractor/0.log" Apr 16 09:13:49.086111 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:49.086080 2535 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6sqkd/perf-node-gather-daemonset-rwc5r" Apr 16 09:13:52.670976 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.670950 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/kube-multus-additional-cni-plugins/0.log" Apr 16 09:13:52.694259 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.694234 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/egress-router-binary-copy/0.log" Apr 16 09:13:52.717310 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.717286 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/cni-plugins/0.log" Apr 16 09:13:52.739838 ip-10-0-139-144 kubenswrapper[2535]: I0416 
09:13:52.739810 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/bond-cni-plugin/0.log" Apr 16 09:13:52.764966 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.764940 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/routeoverride-cni/0.log" Apr 16 09:13:52.790160 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.790135 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/whereabouts-cni-bincopy/0.log" Apr 16 09:13:52.814716 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.814686 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-g4m4l_c79b1bf6-7559-4369-a195-27e8876dde6f/whereabouts-cni/0.log" Apr 16 09:13:52.899088 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.899062 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-t9jnz_bb5bb30d-eff2-430a-b4d1-c40e534c027f/kube-multus/0.log" Apr 16 09:13:52.925135 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.925063 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-88lvl_7146c66d-f9a6-4b2c-8f79-e72ee1b00021/network-metrics-daemon/0.log" Apr 16 09:13:52.951532 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:52.951513 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-88lvl_7146c66d-f9a6-4b2c-8f79-e72ee1b00021/kube-rbac-proxy/0.log" Apr 16 09:13:53.829912 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.829884 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-controller/0.log" Apr 16 09:13:53.849187 
ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.849165 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/0.log" Apr 16 09:13:53.860084 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.860058 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovn-acl-logging/1.log" Apr 16 09:13:53.878770 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.878726 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/kube-rbac-proxy-node/0.log" Apr 16 09:13:53.900084 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.900057 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 09:13:53.919724 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.919697 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/northd/0.log" Apr 16 09:13:53.941054 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.941027 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/nbdb/0.log" Apr 16 09:13:53.963131 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:53.963094 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/sbdb/0.log" Apr 16 09:13:54.054208 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:54.054178 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2b6j8_542120ec-6600-4198-bcc2-69755cfcf1d4/ovnkube-controller/0.log" Apr 16 09:13:55.826096 
ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:55.826070 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bfsfg_3f8fae38-02fe-4a19-b915-1b456238b4eb/network-check-target-container/0.log" Apr 16 09:13:56.836740 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:56.836713 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-p2z7c_6ed7597d-e9d4-47fd-acaf-b04d3b412318/iptables-alerter/0.log" Apr 16 09:13:57.499839 ip-10-0-139-144 kubenswrapper[2535]: I0416 09:13:57.499814 2535 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-gxcvx_cd8eb931-b262-4a3d-a87a-1d92a4f2ac91/tuned/0.log"