Apr 16 18:14:36.970818 ip-10-0-139-117 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:14:36.970829 ip-10-0-139-117 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:14:36.970837 ip-10-0-139-117 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:14:36.971147 ip-10-0-139-117 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:14:47.013269 ip-10-0-139-117 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:14:47.013288 ip-10-0-139-117 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 2683b1a1aacf4c4b8f8ab3293ea4bcbc --
Apr 16 18:17:20.905435 ip-10-0-139-117 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:21.370610 ip-10-0-139-117 kubenswrapper[2582]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:21.370610 ip-10-0-139-117 kubenswrapper[2582]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:21.370610 ip-10-0-139-117 kubenswrapper[2582]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:21.370610 ip-10-0-139-117 kubenswrapper[2582]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
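The first boot fails before the kubelet binary ever runs: systemd cannot load the unit's environment files, so the unit ends with result 'resources'. A minimal, self-contained sketch of that check — the unit content and the /etc/kubernetes/kubelet-env path below are made up for illustration, not taken from this node:

```shell
# Hypothetical reproduction of systemd's EnvironmentFile= check: a path
# without a leading '-' is mandatory, and a missing mandatory file makes
# the unit fail with 'No such file or directory' / result 'resources'.
unit=$(mktemp)
cat > "$unit" <<'EOF'
[Service]
EnvironmentFile=/etc/kubernetes/kubelet-env
EnvironmentFile=-/etc/sysconfig/kubelet
EOF

# Report each mandatory environment file that is missing.
report=$(grep '^EnvironmentFile=' "$unit" | sed 's/^EnvironmentFile=//' | while read -r f; do
  case "$f" in -*) continue ;; esac   # leading '-' marks the file optional; skip it
  if [ -e "$f" ]; then echo "present: $f"; else echo "MISSING: $f"; fi
done)
echo "$report"   # prints 'MISSING: /etc/kubernetes/kubelet-env' when that file is absent
rm -f "$unit"
```

On a live node the same information can be read with `systemctl cat kubelet.service` (to see the EnvironmentFile= lines) and `journalctl -u kubelet` (to see the failure above).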
Apr 16 18:17:21.370610 ip-10-0-139-117 kubenswrapper[2582]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:21.371358 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.371273 2582 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:21.375381 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375363 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:21.375381 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375380 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375384 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375387 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375390 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375393 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375396 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375399 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375404 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375408 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375411 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375414 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375417 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375420 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375422 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375425 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375428 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375431 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375433 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375436 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375439 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:21.375449 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375442 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375444 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375447 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375449 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375452 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375455 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375458 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375460 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375464 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375466 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375469 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375472 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375475 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375477 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375480 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375482 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375485 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375488 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375490 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375493 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:21.375920 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375495 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375498 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375503 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375508 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375510 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375513 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375515 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375518 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375521 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375523 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375525 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375528 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375530 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375533 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375536 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375539 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375542 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375545 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375547 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:21.376420 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375550 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375552 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375555 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375557 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375561 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375564 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375566 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375569 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375572 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375574 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375577 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375579 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375582 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375584 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375588 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375590 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375593 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375595 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375598 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375601 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:21.376970 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375603 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375606 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375608 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375611 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375614 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.375617 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377326 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377334 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377337 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377340 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377343 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377347 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377349 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377352 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377355 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377358 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377361 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377364 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377367 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:21.377515 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377370 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377373 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377375 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377378 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377381 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377383 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377386 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377389 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377392 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377395 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377397 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377400 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377402 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377405 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377407 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377411 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377416 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377418 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377421 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:21.377982 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377424 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377428 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377431 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377435 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377437 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377440 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377443 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377446 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377448 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377451 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377454 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377461 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377463 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377466 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377469 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377471 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377474 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377477 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377479 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:21.378474 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377482 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377484 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377487 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377490 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377493 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377495 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377498 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377501 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377503 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377507 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377510 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377513 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377515 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377518 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377521 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377524 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377527 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377529 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377532 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377535 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:21.378939 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377537 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377540 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377544 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377547 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377550 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377553 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377556 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377558 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377561 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377564 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377566 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377569 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377571 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377574 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.377576 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377646 2582 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377654 2582 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377662 2582 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377666 2582 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377671 2582 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377675 2582 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:21.379442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377679 2582 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377684 2582 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377688 2582 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377691 2582 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377694 2582 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377698 2582 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377701 2582 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377704 2582 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377707 2582 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377710 2582 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377713 2582 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377716 2582 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377719 2582 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377723 2582 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377726 2582 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377729 2582 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377733 2582 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377737 2582 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377741 2582 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377744 2582 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377747 2582 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377750 2582 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377753 2582 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377757 2582 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:21.379947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377760 2582 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377763 2582 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377766 2582 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377771 2582 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377774 2582 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377777 2582 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377780 2582 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377783 2582 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377786 2582 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377791 2582 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377795 2582 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377798 2582 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377801 2582 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377804 2582 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377808 2582 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377811 2582 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377814 2582 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377817 2582 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377821 2582 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377824 2582 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377827 2582 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377830 2582 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377833 2582 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377836 2582 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:17:21.380551 ip-10-0-139-117
kubenswrapper[2582]: I0416 18:17:21.377840 2582 flags.go:64] FLAG: --feature-gates="" Apr 16 18:17:21.380551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377844 2582 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377848 2582 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377851 2582 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377854 2582 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377857 2582 flags.go:64] FLAG: --healthz-port="10248" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377861 2582 flags.go:64] FLAG: --help="false" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377863 2582 flags.go:64] FLAG: --hostname-override="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377867 2582 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377870 2582 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377873 2582 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377876 2582 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377880 2582 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377883 2582 flags.go:64] FLAG: 
--image-gc-low-threshold="80" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377886 2582 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377888 2582 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377891 2582 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377894 2582 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377898 2582 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377901 2582 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377904 2582 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377906 2582 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377909 2582 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377913 2582 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377918 2582 flags.go:64] FLAG: --lock-file="" Apr 16 18:17:21.381191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377922 2582 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377925 2582 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377928 2582 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: 
I0416 18:17:21.377933 2582 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377936 2582 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377939 2582 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377943 2582 flags.go:64] FLAG: --logging-format="text" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377945 2582 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377949 2582 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377952 2582 flags.go:64] FLAG: --manifest-url="" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377955 2582 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377960 2582 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377963 2582 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377967 2582 flags.go:64] FLAG: --max-pods="110" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377970 2582 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377973 2582 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377976 2582 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377979 2582 flags.go:64] FLAG: 
--minimum-container-ttl-duration="6m0s" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377982 2582 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377985 2582 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377989 2582 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.377996 2582 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378000 2582 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378003 2582 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:17:21.381768 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378006 2582 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378008 2582 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378015 2582 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378018 2582 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378021 2582 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378024 2582 flags.go:64] FLAG: --port="10250" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378027 2582 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378031 
2582 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c85294f3f29ce30d" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378034 2582 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378037 2582 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378040 2582 flags.go:64] FLAG: --register-node="true" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378043 2582 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378045 2582 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378049 2582 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378052 2582 flags.go:64] FLAG: --registry-qps="5" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378055 2582 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378058 2582 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378062 2582 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378065 2582 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378069 2582 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378072 2582 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378075 2582 flags.go:64] FLAG: --runonce="false" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378077 2582 
flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378081 2582 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378084 2582 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:17:21.382410 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378087 2582 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378105 2582 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378110 2582 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378115 2582 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378120 2582 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378124 2582 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378127 2582 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378131 2582 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378134 2582 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378137 2582 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378140 2582 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378143 2582 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378149 2582 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378152 2582 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378155 2582 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378160 2582 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378162 2582 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378165 2582 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378168 2582 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378171 2582 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378174 2582 flags.go:64] FLAG: --v="2" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378179 2582 flags.go:64] FLAG: --version="false" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378183 2582 flags.go:64] FLAG: --vmodule="" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378188 2582 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.378191 2582 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:17:21.383020 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378286 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:17:21.383635 ip-10-0-139-117 
kubenswrapper[2582]: W0416 18:17:21.378290 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378293 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378296 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378299 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378302 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378304 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378307 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378309 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378312 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378315 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378317 2582 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378320 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378322 2582 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378325 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378327 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378330 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378333 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378335 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378338 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:17:21.383635 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378341 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378344 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378346 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378349 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378351 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378354 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378357 2582 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378360 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378362 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378365 2582 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378368 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378390 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378394 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378397 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378402 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378406 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378409 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378412 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378415 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:17:21.384265 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378418 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378420 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378423 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378426 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378429 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378432 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378435 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378437 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378440 2582 
feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378442 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378445 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378448 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378450 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378453 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378455 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378458 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378462 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378466 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378468 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:17:21.384847 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378471 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378474 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378480 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378482 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378486 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378488 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378491 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378493 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378496 2582 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378499 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378502 2582 
feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378505 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378507 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378510 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378512 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378515 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378517 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378520 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378523 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378525 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:21.385584 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378528 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378531 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378533 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378536 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378538 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378541 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378543 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.378548 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:21.386241 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.379078 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:21.387906 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.387885 2582 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:21.387946 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.387907 2582 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387963 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387969 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387972 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387975 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387979 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:21.387978 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387982 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387985 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387988 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387991 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387994 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387996 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.387999 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388002 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388004 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388007 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388010 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388012 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388015 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388018 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388021 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388024 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388027 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388029 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388032 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388035 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:21.388146 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388037 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388041 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388044 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388047 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388049 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388052 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388054 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388057 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388060 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388062 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388065 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388067 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388070 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388072 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388076 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388079 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388082 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388084 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388087 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388102 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:21.388661 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388105 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388108 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388111 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388114 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388117 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388119 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388122 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388125 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388128 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388130 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388133 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388135 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388138 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388141 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388143 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388146 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388150 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388155 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388160 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:21.389212 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388163 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388166 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388169 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388172 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388175 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388177 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388180 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388182 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388185 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388188 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388190 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388193 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388197 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388201 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388204 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388206 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388209 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388211 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388214 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:21.389671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388216 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388219 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388221 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.388228 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388352 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388357 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388361 2582 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388365 2582 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388367 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388370 2582 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388373 2582 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388376 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388379 2582 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388382 2582 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388385 2582 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388387 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:21.390242 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388390 2582 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388392 2582 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388395 2582 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388397 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388400 2582 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388402 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388405 2582 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388408 2582 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388410 2582 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388414 2582 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388418 2582 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388421 2582 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388423 2582 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388426 2582 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388428 2582 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388431 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388434 2582 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388436 2582 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388439 2582 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:21.390647 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388442 2582 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388444 2582 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388447 2582 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388449 2582 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388452 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388455 2582 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388457 2582 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388460 2582 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388463 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388466 2582 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388469 2582 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388473 2582 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388476 2582 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388479 2582 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388481 2582 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388484 2582 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388487 2582 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388489 2582 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388492 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:21.391125 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388494 2582 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388497 2582 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388499 2582 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388502 2582 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388504 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388507 2582 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388509 2582 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388512 2582 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388514 2582 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388517 2582 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388519 2582 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388522 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388524 2582 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388527 2582 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388530 2582 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388532 2582 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388535 2582 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388538 2582 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388540 2582 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388543 2582 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:21.391589 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388546 2582 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388548 2582 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388551 2582 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388553 2582 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388556 2582 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388558 2582 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388561 2582 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388564 2582 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388566 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388569 2582 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388571 2582 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388574 2582 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388576 2582 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388579 2582 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388582 2582 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:21.392080 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:21.388593 2582 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:21.392482 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.388598 2582 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:21.392482 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.389367 2582 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:21.393544 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.393529 2582 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:21.394509 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.394498 2582 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:21.394606 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.394588 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:21.394641 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.394627 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:21.421528 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.421504 2582 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:21.424225 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.424197 2582 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:21.440318 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.440299 2582 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:21.445637 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.445620 2582 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:21.446872 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.446858 2582 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:21.452130 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.452086 2582 fs.go:135] Filesystem UUIDs: map[008a628d-2416-44e7-bc4f-9f4018fb6af6:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b61e056d-56a7-4ef1-b5e7-b0ce2d291780:/dev/nvme0n1p3]
Apr 16 18:17:21.452204 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.452129 2582 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:21.453288 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.453272 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:21.458686 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.458573 2582 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:21.457084816 +0000 UTC m=+0.433201033 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3093179 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2229e09c879f0ef2c494fbf0944cc9 SystemUUID:ec2229e0-9c87-9f0e-f2c4-94fbf0944cc9 BootID:2683b1a1-aacf-4c4b-8f8a-b3293ea4bcbc Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:23:00:5e:da:95 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:23:00:5e:da:95 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:b6:82:7a:86:a8:bd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:21.458686 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.458673 2582 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:17:21.458830 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.458761 2582 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:21.460216 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460186 2582 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:21.460392 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460217 2582 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-117.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":
"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 18:17:21.460488 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460406 2582 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 18:17:21.460488 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460418 2582 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 18:17:21.460488 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460436 2582 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:17:21.460488 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.460458 2582 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 18:17:21.461884 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.461871 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:17:21.462021 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.462010 2582 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 18:17:21.464669 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.464657 2582 kubelet.go:491] "Attempting to sync node with API server" Apr 16 18:17:21.464722 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.464674 2582 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 18:17:21.465402 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.465392 2582 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 18:17:21.465459 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.465409 2582 kubelet.go:397] "Adding apiserver pod source" Apr 16 18:17:21.465459 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.465425 2582 apiserver.go:42] "Waiting for node sync 
before watching apiserver pods" Apr 16 18:17:21.466704 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.466690 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:21.466786 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.466714 2582 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 18:17:21.470032 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.470017 2582 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:17:21.472140 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.472124 2582 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:17:21.473338 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473326 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473343 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473349 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473355 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473360 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473366 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473372 2582 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/iscsi" Apr 16 18:17:21.473378 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473377 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:17:21.473573 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473383 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:17:21.473573 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473390 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 18:17:21.473573 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473399 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:17:21.473573 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.473408 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:17:21.474325 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.474316 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:17:21.474361 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.474326 2582 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:17:21.476793 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.476771 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xgm9g" Apr 16 18:17:21.477087 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.477067 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-117.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:17:21.477298 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.477283 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" 
cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:17:21.477384 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.477368 2582 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-139-117.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 18:17:21.478037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.478025 2582 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:17:21.478082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.478061 2582 server.go:1295] "Started kubelet" Apr 16 18:17:21.478164 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.478140 2582 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:17:21.478944 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.478712 2582 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:17:21.479045 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.478979 2582 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:17:21.479038 ip-10-0-139-117 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 18:17:21.479703 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.479661 2582 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:17:21.480280 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.480257 2582 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:17:21.484519 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.484502 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-xgm9g" Apr 16 18:17:21.485869 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.485850 2582 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:17:21.485869 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.485860 2582 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:17:21.486625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.486456 2582 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:17:21.486625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.486478 2582 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:17:21.486625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.486606 2582 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:17:21.486876 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.486645 2582 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:17:21.486876 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.486655 2582 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:17:21.486967 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.486871 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.487069 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487037 2582 factory.go:55] Registering systemd factory Apr 16 
18:17:21.487069 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487069 2582 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:17:21.487387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487366 2582 factory.go:153] Registering CRI-O factory Apr 16 18:17:21.487387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487385 2582 factory.go:223] Registration of the crio container factory successfully Apr 16 18:17:21.487523 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487447 2582 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:17:21.487523 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487472 2582 factory.go:103] Registering Raw factory Apr 16 18:17:21.487523 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.487487 2582 manager.go:1196] Started watching for new ooms in manager Apr 16 18:17:21.488068 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.488048 2582 manager.go:319] Starting recovery of all containers Apr 16 18:17:21.488794 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.488766 2582 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:17:21.494822 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.494790 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:21.498545 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.498525 2582 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-117.ec2.internal\" not found" node="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.499145 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.499131 2582 manager.go:324] Recovery completed Apr 16 18:17:21.503536 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.503522 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.505851 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.505829 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.505915 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.505858 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:21.505915 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.505869 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.506390 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.506376 2582 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:17:21.506390 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.506389 2582 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:17:21.506483 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.506409 2582 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:17:21.508801 
ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.508790 2582 policy_none.go:49] "None policy: Start" Apr 16 18:17:21.508847 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.508810 2582 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:17:21.508847 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.508820 2582 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:17:21.543686 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.543672 2582 manager.go:341] "Starting Device Plugin manager" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.543703 2582 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.543716 2582 server.go:85] "Starting device plugin registration server" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.543984 2582 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.543998 2582 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.544107 2582 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.544189 2582 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.544208 2582 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.544667 2582 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 16 18:17:21.555357 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.544708 2582 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.612669 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.612638 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:17:21.613865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.613842 2582 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:17:21.613865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.613866 2582 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:17:21.613996 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.613886 2582 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 18:17:21.613996 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.613899 2582 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:17:21.613996 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.613936 2582 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:17:21.616439 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.616414 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:21.644480 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.644397 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.645413 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.645395 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.645513 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.645428 2582 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:21.645513 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.645449 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.645513 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.645479 2582 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.654488 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.654468 2582 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.654554 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.654495 2582 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-117.ec2.internal\": node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.666582 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.666554 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.714357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.714305 2582 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal"] Apr 16 18:17:21.714438 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.714422 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.715420 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.715405 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.715497 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.715434 2582 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:21.715497 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.715443 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.716680 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.716668 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.716867 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.716853 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.716917 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.716882 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.717437 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717421 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.717519 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717457 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:21.717519 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717469 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.717519 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717421 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.717620 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717538 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 18:17:21.717620 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.717548 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.719319 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.719303 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.719385 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.719337 2582 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:17:21.720023 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.720001 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:17:21.720114 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.720046 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:17:21.720114 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.720058 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:17:21.743860 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.743835 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-117.ec2.internal\" not found" node="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.748423 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.748406 2582 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-117.ec2.internal\" not found" node="ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.767646 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.767620 2582 kubelet_node_status.go:515] "Error getting the current node 
from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.868052 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.868018 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.888647 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.888617 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.888719 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.888659 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.888719 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.888695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c9d1315a36f55d6d5f62663bd6ecf76-config\") pod \"kube-apiserver-proxy-ip-10-0-139-117.ec2.internal\" (UID: \"4c9d1315a36f55d6d5f62663bd6ecf76\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.968931 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:21.968853 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found" Apr 16 18:17:21.989229 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989197 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c9d1315a36f55d6d5f62663bd6ecf76-config\") pod \"kube-apiserver-proxy-ip-10-0-139-117.ec2.internal\" (UID: \"4c9d1315a36f55d6d5f62663bd6ecf76\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.989299 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989235 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.989299 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989265 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.989299 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" Apr 16 18:17:21.989392 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989294 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4c9d1315a36f55d6d5f62663bd6ecf76-config\") pod 
\"kube-apiserver-proxy-ip-10-0-139-117.ec2.internal\" (UID: \"4c9d1315a36f55d6d5f62663bd6ecf76\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:21.989392 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:21.989301 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/38c6a73e51b044e7d224a339aafa2682-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal\" (UID: \"38c6a73e51b044e7d224a339aafa2682\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:22.047317 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.047285 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:22.050871 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.050850 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:22.069785 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.069749 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found"
Apr 16 18:17:22.170318 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.170281 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found"
Apr 16 18:17:22.270851 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.270794 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found"
Apr 16 18:17:22.371502 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.371466 2582 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-117.ec2.internal\" not found"
Apr 16 18:17:22.393933 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.393907 2582 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:17:22.394081 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.394063 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:22.394158 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.394082 2582 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:22.454883 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.454856 2582 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:22.466029 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.466000 2582 apiserver.go:52] "Watching apiserver"
Apr 16 18:17:22.482947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.482917 2582 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:17:22.483256 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.483236 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-4mngt","openshift-image-registry/node-ca-n229m","openshift-multus/multus-additional-cni-plugins-hpm7j","openshift-multus/network-metrics-daemon-4zpgf","openshift-network-diagnostics/network-check-target-crg6m","openshift-network-operator/iptables-alerter-qgnq9","openshift-ovn-kubernetes/ovnkube-node-2frx4","kube-system/konnectivity-agent-7t997","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54","openshift-cluster-node-tuning-operator/tuned-v2mgc"]
Apr 16 18:17:22.486045 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486014 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:22.486166 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486061 2582 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:22.486166 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486061 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:21 +0000 UTC" deadline="2027-12-24 01:03:07.628915114 +0000 UTC"
Apr 16 18:17:22.486166 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486117 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14790h45m45.142802395s"
Apr 16 18:17:22.486393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486379 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.486502 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.486480 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.487650 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.487631 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.489884 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.489865 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:22.489986 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.489957 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:22.490824 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.490808 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:22.490911 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.490869 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:22.492147 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492128 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.492231 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492157 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-cnibin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492231 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492193 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-k8s-cni-cncf-io\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492231 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492220 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-daemon-config\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492243 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bead58a1-c7d1-4221-8dba-7355ad1eee28-host\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492272 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492295 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492318 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-socket-dir-parent\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-hostroot\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492364 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-etc-kubernetes\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492401 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492421 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-netns\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492436 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-kubelet\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492451 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fmg\" (UniqueName: \"kubernetes.io/projected/bead58a1-c7d1-4221-8dba-7355ad1eee28-kube-api-access-g9fmg\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492468 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-system-cni-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492489 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-os-release\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492506 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdd46\" (UniqueName: \"kubernetes.io/projected/c19385c2-b1c2-45bc-a50b-91342bfe5265-kube-api-access-sdd46\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492522 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-system-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492538 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-cni-binary-copy\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492559 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-bin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.492605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492596 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-multus\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492620 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-conf-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492637 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-multus-certs\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492677 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzd5h\" (UniqueName: \"kubernetes.io/projected/12cb712b-2a1d-4af9-a5dc-79912365f003-kube-api-access-fzd5h\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492693 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bead58a1-c7d1-4221-8dba-7355ad1eee28-serviceca\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492707 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-binary-copy\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492733 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-os-release\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492756 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-cnibin\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.493043 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.492781 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.493516 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.493502 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.494230 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494207 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.494326 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494207 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.494326 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494213 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.494897 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494874 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:17:22.494990 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494911 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:17:22.494990 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.494985 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:17:22.495128 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.495005 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kfgpq\""
Apr 16 18:17:22.495201 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.495162 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:22.495551 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.495530 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.495641 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.495581 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:17:22.495641 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.495584 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:17:22.496022 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.496009 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5cb9n\""
Apr 16 18:17:22.496617 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.496575 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d4rm7\""
Apr 16 18:17:22.496796 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.496783 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:17:22.496846 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.496813 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.497350 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.497332 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.497754 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.497741 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr74h\""
Apr 16 18:17:22.498062 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.498044 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.498237 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.498090 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zzn7w\""
Apr 16 18:17:22.498237 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.498150 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.499186 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499169 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:17:22.499262 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499208 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:17:22.499607 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499488 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:17:22.499607 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499495 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.499874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499721 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:17:22.499874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499737 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:17:22.499874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499780 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.499874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.499794 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.501145 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.500470 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:17:22.501145 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.500543 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fwp8m\""
Apr 16 18:17:22.501145 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.500614 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:17:22.501145 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.500781 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nk5z8\""
Apr 16 18:17:22.501412 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.501399 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t4g87\""
Apr 16 18:17:22.501529 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.501487 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:22.502750 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.502726 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.503067 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.503049 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:22.509169 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.509152 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal"]
Apr 16 18:17:22.509864 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.509843 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:22.509943 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.509914 2582 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal"
Apr 16 18:17:22.516175 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.516151 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c9d1315a36f55d6d5f62663bd6ecf76.slice/crio-302c220a55149e034fdac28c823a3b38f6e5ea0aa6876b963a1eeabe5a0ac098 WatchSource:0}: Error finding container 302c220a55149e034fdac28c823a3b38f6e5ea0aa6876b963a1eeabe5a0ac098: Status 404 returned error can't find the container with id 302c220a55149e034fdac28c823a3b38f6e5ea0aa6876b963a1eeabe5a0ac098
Apr 16 18:17:22.516353 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.516335 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38c6a73e51b044e7d224a339aafa2682.slice/crio-f07313766a40e165de3f8c3a16502afa697dfd7947d511b12437c1e5096d98ed WatchSource:0}: Error finding container f07313766a40e165de3f8c3a16502afa697dfd7947d511b12437c1e5096d98ed: Status 404 returned error can't find the container with id f07313766a40e165de3f8c3a16502afa697dfd7947d511b12437c1e5096d98ed
Apr 16 18:17:22.519786 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.519765 2582 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:22.520629 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.520612 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal"]
Apr 16 18:17:22.520888 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.520845 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:17:22.522974 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.522958 2582 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:22.584296 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.584269 2582 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-677gs"
Apr 16 18:17:22.588048 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.588028 2582 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:17:22.593878 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593859 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn7lw\" (UniqueName: \"kubernetes.io/projected/14a34dbf-ea86-4998-809d-5078b679506c-kube-api-access-hn7lw\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.593948 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593886 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-systemd-units\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.593948 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593902 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-log-socket\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.593948 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593919 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-netd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593967 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovn-node-metrics-cert\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.593988 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-script-lib\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594007 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-socket-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.594037 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594026 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-binary-copy\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.594203 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594042 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-lib-modules\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.594203 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594064 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-host\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.594203 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594078 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnqx2\" (UniqueName: \"kubernetes.io/projected/847e2695-c897-4ed9-95c4-10d0fbef9e09-kube-api-access-qnqx2\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:22.594203 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594108 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.594203 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594152 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-k8s-cni-cncf-io\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594203 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-k8s-cni-cncf-io\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594210 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-daemon-config\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594249 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bead58a1-c7d1-4221-8dba-7355ad1eee28-host\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594294 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bead58a1-c7d1-4221-8dba-7355ad1eee28-host\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594307 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a34dbf-ea86-4998-809d-5078b679506c-host-slash\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.594347 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594336 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-config\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594361 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594392 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-var-lib-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594417 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594444 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-etc-kubernetes\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594469 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-node-log\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594492 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-etc-selinux\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594523 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-kubelet\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594549 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594548 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594496 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-etc-kubernetes\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594552 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-etc-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594601 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-bin\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594610 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-kubelet\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594631 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-binary-copy\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594648 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-device-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594682 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-os-release\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594753 2582 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-slash\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594757 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-daemon-config\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594783 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-konnectivity-ca\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594806 2582 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-677gs" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594810 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-tmp\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594815 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-os-release\") pod 
\"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594837 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-bin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.594861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594860 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-multus\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594882 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594897 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-bin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594885 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzd5h\" (UniqueName: 
\"kubernetes.io/projected/12cb712b-2a1d-4af9-a5dc-79912365f003-kube-api-access-fzd5h\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594926 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-var-lib-cni-multus\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594944 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594964 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-sys-fs\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.594987 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-var-lib-kubelet\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595006 2582 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-netns\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595033 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-systemd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595087 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-env-overrides\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595121 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-agent-certs\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595145 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-os-release\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:17:22.595161 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4mj\" (UniqueName: \"kubernetes.io/projected/74787fd3-6aff-45fa-b4f4-4f97b01f0899-kube-api-access-vn4mj\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595192 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-systemd\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595215 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-os-release\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595220 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-run\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.595382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595246 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-cnibin\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.595911 
ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-cnibin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595293 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-cnibin\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595323 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmxch\" (UniqueName: \"kubernetes.io/projected/e0e2d86c-c5dd-4964-a804-52d78bb9d593-kube-api-access-hmxch\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595332 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-cnibin\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595346 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-ovn\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.595911 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:17:22.595369 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t9ks\" (UniqueName: \"kubernetes.io/projected/f6f826d5-a016-4f49-8153-5a1a3cd21012-kube-api-access-2t9ks\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595390 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595407 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-socket-dir-parent\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595426 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-hostroot\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595449 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-kubernetes\") pod \"tuned-v2mgc\" (UID: 
\"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595480 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-hostroot\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595482 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-socket-dir-parent\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595495 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595520 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-netns\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595543 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fmg\" (UniqueName: \"kubernetes.io/projected/bead58a1-c7d1-4221-8dba-7355ad1eee28-kube-api-access-g9fmg\") pod 
\"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595569 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:22.595911 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595575 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-netns\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595596 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14a34dbf-ea86-4998-809d-5078b679506c-iptables-alerter-script\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595620 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-kubelet\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595642 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595667 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595693 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-system-cni-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595747 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c19385c2-b1c2-45bc-a50b-91342bfe5265-system-cni-dir\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595740 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdd46\" (UniqueName: \"kubernetes.io/projected/c19385c2-b1c2-45bc-a50b-91342bfe5265-kube-api-access-sdd46\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 
18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595786 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-system-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595839 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-registration-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595868 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-conf\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595874 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-system-cni-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595884 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") 
" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595892 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-sys\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595946 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-cni-binary-copy\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595951 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c19385c2-b1c2-45bc-a50b-91342bfe5265-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595970 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-conf-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.596531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.595997 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-multus-certs\") pod \"multus-4mngt\" 
(UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596004 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-multus-conf-dir\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bead58a1-c7d1-4221-8dba-7355ad1eee28-serviceca\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596064 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-modprobe-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596106 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysconfig\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596139 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/12cb712b-2a1d-4af9-a5dc-79912365f003-host-run-multus-certs\") pod \"multus-4mngt\" (UID: 
\"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596141 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-tuned\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.597029 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/12cb712b-2a1d-4af9-a5dc-79912365f003-cni-binary-copy\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.597124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.596816 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bead58a1-c7d1-4221-8dba-7355ad1eee28-serviceca\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m" Apr 16 18:17:22.607943 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.607920 2582 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:17:22.611157 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.611139 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzd5h\" (UniqueName: \"kubernetes.io/projected/12cb712b-2a1d-4af9-a5dc-79912365f003-kube-api-access-fzd5h\") pod \"multus-4mngt\" (UID: \"12cb712b-2a1d-4af9-a5dc-79912365f003\") " pod="openshift-multus/multus-4mngt" Apr 16 18:17:22.611983 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.611953 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fmg\" (UniqueName: \"kubernetes.io/projected/bead58a1-c7d1-4221-8dba-7355ad1eee28-kube-api-access-g9fmg\") pod \"node-ca-n229m\" (UID: \"bead58a1-c7d1-4221-8dba-7355ad1eee28\") " pod="openshift-image-registry/node-ca-n229m" Apr 16 18:17:22.613389 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.613372 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdd46\" (UniqueName: \"kubernetes.io/projected/c19385c2-b1c2-45bc-a50b-91342bfe5265-kube-api-access-sdd46\") pod \"multus-additional-cni-plugins-hpm7j\" (UID: \"c19385c2-b1c2-45bc-a50b-91342bfe5265\") " pod="openshift-multus/multus-additional-cni-plugins-hpm7j" Apr 16 18:17:22.616807 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.616773 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" event={"ID":"38c6a73e51b044e7d224a339aafa2682","Type":"ContainerStarted","Data":"f07313766a40e165de3f8c3a16502afa697dfd7947d511b12437c1e5096d98ed"} Apr 16 18:17:22.617617 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.617600 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" 
event={"ID":"4c9d1315a36f55d6d5f62663bd6ecf76","Type":"ContainerStarted","Data":"302c220a55149e034fdac28c823a3b38f6e5ea0aa6876b963a1eeabe5a0ac098"} Apr 16 18:17:22.697364 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697333 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-systemd-units\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697364 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697365 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-log-socket\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697384 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-netd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697405 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697431 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovn-node-metrics-cert\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697450 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-log-socket\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697465 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-netd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697483 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697500 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-systemd-units\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697562 2582 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-script-lib\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697593 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697590 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-socket-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697609 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-lib-modules\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697633 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-host\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697657 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnqx2\" (UniqueName: \"kubernetes.io/projected/847e2695-c897-4ed9-95c4-10d0fbef9e09-kube-api-access-qnqx2\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697683 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a34dbf-ea86-4998-809d-5078b679506c-host-slash\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697703 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-host\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697708 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-config\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697731 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-lib-modules\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697738 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-socket-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697755 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-var-lib-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697780 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14a34dbf-ea86-4998-809d-5078b679506c-host-slash\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697783 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697820 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-node-log\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697846 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-etc-selinux\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: 
I0416 18:17:22.697856 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-var-lib-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697874 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-etc-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697888 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.697954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697928 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-node-log\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.697960 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-bin\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:17:22.697998 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-etc-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698001 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-device-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698037 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698037 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-cni-bin\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698071 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-slash\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:17:22.698124 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-konnectivity-ca\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698138 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-device-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698144 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-etc-selinux\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698156 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-tmp\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.698140 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698168 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-slash\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698211 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698221 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-config\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698220 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovnkube-script-lib\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.698253 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:23.198228157 +0000 UTC m=+2.174344365 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:22.698735 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698255 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-ovn-kubernetes\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698277 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-sys-fs\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698313 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-var-lib-kubelet\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698334 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-sys-fs\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:17:22.698339 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-netns\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698368 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-systemd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698375 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-run-netns\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698392 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-env-overrides\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698398 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-var-lib-kubelet\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:17:22.698414 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-agent-certs\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698416 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-systemd\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698443 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4mj\" (UniqueName: \"kubernetes.io/projected/74787fd3-6aff-45fa-b4f4-4f97b01f0899-kube-api-access-vn4mj\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698469 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-systemd\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698487 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-run\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698537 2582 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hmxch\" (UniqueName: \"kubernetes.io/projected/e0e2d86c-c5dd-4964-a804-52d78bb9d593-kube-api-access-hmxch\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698564 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-ovn\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698589 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2t9ks\" (UniqueName: \"kubernetes.io/projected/f6f826d5-a016-4f49-8153-5a1a3cd21012-kube-api-access-2t9ks\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" Apr 16 18:17:22.699453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698616 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-kubernetes\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698660 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " 
pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698685 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14a34dbf-ea86-4998-809d-5078b679506c-iptables-alerter-script\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698688 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-konnectivity-ca\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698709 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-kubelet\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698734 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698758 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-ovn\") pod \"ovnkube-node-2frx4\" (UID: 
\"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698771 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698789 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/74787fd3-6aff-45fa-b4f4-4f97b01f0899-env-overrides\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698801 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-registration-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698831 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-run\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698839 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-conf\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698854 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-registration-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698868 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-sys\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698880 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-systemd\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698900 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-modprobe-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698928 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysconfig\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.699935 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698928 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-run-openvswitch\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698951 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-tuned\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698963 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-kubernetes\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698976 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/74787fd3-6aff-45fa-b4f4-4f97b01f0899-host-kubelet\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.698984 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn7lw\" (UniqueName: \"kubernetes.io/projected/14a34dbf-ea86-4998-809d-5078b679506c-kube-api-access-hn7lw\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699020 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysconfig\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699060 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-modprobe-d\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699069 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f6f826d5-a016-4f49-8153-5a1a3cd21012-kubelet-dir\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699128 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-sys\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699236 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-sysctl-conf\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.699406 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14a34dbf-ea86-4998-809d-5078b679506c-iptables-alerter-script\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.700227 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/74787fd3-6aff-45fa-b4f4-4f97b01f0899-ovn-node-metrics-cert\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.700520 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.700392 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-tmp\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.701170 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.701152 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e0e2d86c-c5dd-4964-a804-52d78bb9d593-etc-tuned\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.701368 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.701353 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3bf7ebd6-29a1-4bfa-957a-f30502b557e5-agent-certs\") pod \"konnectivity-agent-7t997\" (UID: \"3bf7ebd6-29a1-4bfa-957a-f30502b557e5\") " pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:22.709196 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.709165 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:22.709196 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.709189 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:22.709309 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.709221 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:22.709350 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:22.709324 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. No retries permitted until 2026-04-16 18:17:23.209281618 +0000 UTC m=+2.185397825 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:22.710326 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.710298 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnqx2\" (UniqueName: \"kubernetes.io/projected/847e2695-c897-4ed9-95c4-10d0fbef9e09-kube-api-access-qnqx2\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:22.710460 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.710444 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4mj\" (UniqueName: \"kubernetes.io/projected/74787fd3-6aff-45fa-b4f4-4f97b01f0899-kube-api-access-vn4mj\") pod \"ovnkube-node-2frx4\" (UID: \"74787fd3-6aff-45fa-b4f4-4f97b01f0899\") " pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.711124 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.711089 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn7lw\" (UniqueName: \"kubernetes.io/projected/14a34dbf-ea86-4998-809d-5078b679506c-kube-api-access-hn7lw\") pod \"iptables-alerter-qgnq9\" (UID: \"14a34dbf-ea86-4998-809d-5078b679506c\") " pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.711455 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.711440 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmxch\" (UniqueName: \"kubernetes.io/projected/e0e2d86c-c5dd-4964-a804-52d78bb9d593-kube-api-access-hmxch\") pod \"tuned-v2mgc\" (UID: \"e0e2d86c-c5dd-4964-a804-52d78bb9d593\") " pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.711658 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.711638 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t9ks\" (UniqueName: \"kubernetes.io/projected/f6f826d5-a016-4f49-8153-5a1a3cd21012-kube-api-access-2t9ks\") pod \"aws-ebs-csi-driver-node-p7h54\" (UID: \"f6f826d5-a016-4f49-8153-5a1a3cd21012\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.822190 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.822085 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-4mngt"
Apr 16 18:17:22.827855 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.827829 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-n229m"
Apr 16 18:17:22.829017 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.828986 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12cb712b_2a1d_4af9_a5dc_79912365f003.slice/crio-6859a1ada9feff8704196718d36742781cfd45ee74a58852a6cde0dfedacf243 WatchSource:0}: Error finding container 6859a1ada9feff8704196718d36742781cfd45ee74a58852a6cde0dfedacf243: Status 404 returned error can't find the container with id 6859a1ada9feff8704196718d36742781cfd45ee74a58852a6cde0dfedacf243
Apr 16 18:17:22.834333 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.834309 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbead58a1_c7d1_4221_8dba_7355ad1eee28.slice/crio-ab281f662003357a3e59f9fd1fd8ce39f582ffcef94c8a94b0e0f16371a99315 WatchSource:0}: Error finding container ab281f662003357a3e59f9fd1fd8ce39f582ffcef94c8a94b0e0f16371a99315: Status 404 returned error can't find the container with id ab281f662003357a3e59f9fd1fd8ce39f582ffcef94c8a94b0e0f16371a99315
Apr 16 18:17:22.834887 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.834867 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hpm7j"
Apr 16 18:17:22.840010 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.839987 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgnq9"
Apr 16 18:17:22.843505 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.843475 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc19385c2_b1c2_45bc_a50b_91342bfe5265.slice/crio-00d875eca233a2ca5870a53d7b0be2128e0ab302a609654d7bef2cbf37e0899d WatchSource:0}: Error finding container 00d875eca233a2ca5870a53d7b0be2128e0ab302a609654d7bef2cbf37e0899d: Status 404 returned error can't find the container with id 00d875eca233a2ca5870a53d7b0be2128e0ab302a609654d7bef2cbf37e0899d
Apr 16 18:17:22.844721 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.844704 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:22.849260 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.849230 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14a34dbf_ea86_4998_809d_5078b679506c.slice/crio-ab0de7362bcd56773cb519e4e9c823bf5c3af39cb81947266be2cc6d61d08161 WatchSource:0}: Error finding container ab0de7362bcd56773cb519e4e9c823bf5c3af39cb81947266be2cc6d61d08161: Status 404 returned error can't find the container with id ab0de7362bcd56773cb519e4e9c823bf5c3af39cb81947266be2cc6d61d08161
Apr 16 18:17:22.850287 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.850270 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:22.853791 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.853765 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74787fd3_6aff_45fa_b4f4_4f97b01f0899.slice/crio-cd1374878912da080fda0017f6754d1c0da9bfd7348099b4102d17d72cdb6377 WatchSource:0}: Error finding container cd1374878912da080fda0017f6754d1c0da9bfd7348099b4102d17d72cdb6377: Status 404 returned error can't find the container with id cd1374878912da080fda0017f6754d1c0da9bfd7348099b4102d17d72cdb6377
Apr 16 18:17:22.855626 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.855604 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54"
Apr 16 18:17:22.857160 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.857140 2582 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:22.857901 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.857878 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf7ebd6_29a1_4bfa_957a_f30502b557e5.slice/crio-27547562c3407e2093a7b35c3f60ef36d9934be6f185b995f88bdbeaced6fa18 WatchSource:0}: Error finding container 27547562c3407e2093a7b35c3f60ef36d9934be6f185b995f88bdbeaced6fa18: Status 404 returned error can't find the container with id 27547562c3407e2093a7b35c3f60ef36d9934be6f185b995f88bdbeaced6fa18
Apr 16 18:17:22.860986 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:22.860965 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-v2mgc"
Apr 16 18:17:22.863209 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.863184 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6f826d5_a016_4f49_8153_5a1a3cd21012.slice/crio-53683278c48c56944fe26bde3dedcc6165535b561e0380011ae263326b7e7445 WatchSource:0}: Error finding container 53683278c48c56944fe26bde3dedcc6165535b561e0380011ae263326b7e7445: Status 404 returned error can't find the container with id 53683278c48c56944fe26bde3dedcc6165535b561e0380011ae263326b7e7445
Apr 16 18:17:22.868004 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:17:22.867982 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0e2d86c_c5dd_4964_a804_52d78bb9d593.slice/crio-acb7133bcf75718c454222f9c0886fdb74a018f2d749deb20e892a15950f9870 WatchSource:0}: Error finding container acb7133bcf75718c454222f9c0886fdb74a018f2d749deb20e892a15950f9870: Status 404 returned error can't find the container with id acb7133bcf75718c454222f9c0886fdb74a018f2d749deb20e892a15950f9870
Apr 16 18:17:23.204243 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.203487 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:23.204243 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.203659 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:23.204243 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.203738 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:24.203708362 +0000 UTC m=+3.179824554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:23.305011 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.304314 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:23.305011 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.304550 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:23.305011 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.304572 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:23.305011 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.304596 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:23.305011 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:23.304655 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. No retries permitted until 2026-04-16 18:17:24.30463636 +0000 UTC m=+3.280752554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:23.335385 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.335353 2582 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:23.596950 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.596807 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:22 +0000 UTC" deadline="2028-01-10 20:41:22.33135929 +0000 UTC"
Apr 16 18:17:23.596950 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.596849 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15218h23m58.73451461s"
Apr 16 18:17:23.631521 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.631451 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" event={"ID":"e0e2d86c-c5dd-4964-a804-52d78bb9d593","Type":"ContainerStarted","Data":"acb7133bcf75718c454222f9c0886fdb74a018f2d749deb20e892a15950f9870"}
Apr 16 18:17:23.644289 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.644204 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" event={"ID":"f6f826d5-a016-4f49-8153-5a1a3cd21012","Type":"ContainerStarted","Data":"53683278c48c56944fe26bde3dedcc6165535b561e0380011ae263326b7e7445"}
Apr 16 18:17:23.654625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.654570 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"cd1374878912da080fda0017f6754d1c0da9bfd7348099b4102d17d72cdb6377"}
Apr 16 18:17:23.669846 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.669803 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n229m" event={"ID":"bead58a1-c7d1-4221-8dba-7355ad1eee28","Type":"ContainerStarted","Data":"ab281f662003357a3e59f9fd1fd8ce39f582ffcef94c8a94b0e0f16371a99315"}
Apr 16 18:17:23.687046 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.687002 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7t997" event={"ID":"3bf7ebd6-29a1-4bfa-957a-f30502b557e5","Type":"ContainerStarted","Data":"27547562c3407e2093a7b35c3f60ef36d9934be6f185b995f88bdbeaced6fa18"}
Apr 16 18:17:23.692437 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.692327 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgnq9" event={"ID":"14a34dbf-ea86-4998-809d-5078b679506c","Type":"ContainerStarted","Data":"ab0de7362bcd56773cb519e4e9c823bf5c3af39cb81947266be2cc6d61d08161"}
Apr 16 18:17:23.705799 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.705583 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerStarted","Data":"00d875eca233a2ca5870a53d7b0be2128e0ab302a609654d7bef2cbf37e0899d"}
Apr 16 18:17:23.723276 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.723231 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mngt" event={"ID":"12cb712b-2a1d-4af9-a5dc-79912365f003","Type":"ContainerStarted","Data":"6859a1ada9feff8704196718d36742781cfd45ee74a58852a6cde0dfedacf243"}
Apr 16 18:17:23.959400 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:23.959085 2582 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:24.211169 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.210526 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:24.211169 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.210696 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:24.211169 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.210759 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.210739576 +0000 UTC m=+5.186855781 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:24.311757 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.311053 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:24.311757 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.311312 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:24.311757 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.311333 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:24.311757 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.311345 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:24.311757 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.311406 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. No retries permitted until 2026-04-16 18:17:26.311387137 +0000 UTC m=+5.287503331 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:24.598119 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.597985 2582 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:22 +0000 UTC" deadline="2028-01-03 00:41:04.874146912 +0000 UTC"
Apr 16 18:17:24.598119 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.598027 2582 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15030h23m40.276123863s"
Apr 16 18:17:24.615625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.614913 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:24.615625 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.615041 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:24.615625 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:24.615481 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:24.615625 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:24.615580 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:26.225463 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:26.225390 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:26.225914 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.225572 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:26.225914 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.225654 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:30.225633315 +0000 UTC m=+9.201749508 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:26.326227 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:26.326120 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:26.326399 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.326252 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:26.326399 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.326278 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:26.326399 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.326322 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:26.326399 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.326389 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. No retries permitted until 2026-04-16 18:17:30.326370277 +0000 UTC m=+9.302486472 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:26.614942 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:26.614394 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:26.614942 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.614532 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:26.614942 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:26.614600 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:26.614942 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:26.614712 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:27.073159 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.073060 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cj779"]
Apr 16 18:17:27.074925 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.074896 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cj779"
Apr 16 18:17:27.077821 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.077780 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:17:27.078432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.078258 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-824f6\""
Apr 16 18:17:27.078432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.078294 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:17:27.132281 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.132246 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f68745e4-3c2b-4cbe-80a9-80320d887584-hosts-file\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779"
Apr 16 18:17:27.132442 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.132313 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szdk9\" (UniqueName: \"kubernetes.io/projected/f68745e4-3c2b-4cbe-80a9-80320d887584-kube-api-access-szdk9\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779"
Apr 16 18:17:27.132442 ip-10-0-139-117 kubenswrapper[2582]: I0416
18:17:27.132354 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f68745e4-3c2b-4cbe-80a9-80320d887584-tmp-dir\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.234082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.233123 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f68745e4-3c2b-4cbe-80a9-80320d887584-tmp-dir\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.234082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.233227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f68745e4-3c2b-4cbe-80a9-80320d887584-hosts-file\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.234082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.233268 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-szdk9\" (UniqueName: \"kubernetes.io/projected/f68745e4-3c2b-4cbe-80a9-80320d887584-kube-api-access-szdk9\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.234082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.233908 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f68745e4-3c2b-4cbe-80a9-80320d887584-tmp-dir\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.234082 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.234045 2582 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f68745e4-3c2b-4cbe-80a9-80320d887584-hosts-file\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.246712 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.246682 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-szdk9\" (UniqueName: \"kubernetes.io/projected/f68745e4-3c2b-4cbe-80a9-80320d887584-kube-api-access-szdk9\") pod \"node-resolver-cj779\" (UID: \"f68745e4-3c2b-4cbe-80a9-80320d887584\") " pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:27.387830 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:27.387744 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cj779" Apr 16 18:17:28.614685 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:28.614642 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:28.615197 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:28.614779 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:28.615197 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:28.614642 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:28.615333 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:28.615235 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:30.256687 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:30.256616 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:30.257155 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.256790 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:30.257155 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.256868 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.256847158 +0000 UTC m=+17.232963353 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:30.357993 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:30.357923 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:30.358197 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.358119 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:30.358197 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.358147 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:30.358197 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.358162 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:30.358347 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.358231 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:38.358209955 +0000 UTC m=+17.334326158 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:30.615042 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:30.614961 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:30.615228 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:30.615045 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:30.615228 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.615129 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:30.615228 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:30.615193 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:32.615110 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:32.615058 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:32.615522 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:32.615072 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:32.615522 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:32.615200 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:32.615522 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:32.615318 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:34.614953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:34.614917 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:34.614953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:34.614938 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:34.615440 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:34.615022 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:34.615779 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:34.615180 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:36.614899 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:36.614865 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:36.615383 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:36.614865 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:36.615383 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:36.614992 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:36.615383 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:36.615062 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:38.316079 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:38.316032 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:38.316524 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.316198 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:38.316524 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.316272 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:54.316247529 +0000 UTC m=+33.292363718 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:38.416452 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:38.416405 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:38.416636 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.416572 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:38.416636 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.416602 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:38.416636 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.416615 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:38.416805 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.416670 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:54.41665553 +0000 UTC m=+33.392771718 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:38.615050 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:38.614972 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:38.615193 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:38.614972 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:38.615193 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.615084 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:38.615193 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:38.615172 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:40.615111 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:40.614934 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:40.615428 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:40.614944 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:40.615428 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:40.615210 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd" Apr 16 18:17:40.615428 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:40.615242 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:17:40.754878 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:40.754837 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cj779" event={"ID":"f68745e4-3c2b-4cbe-80a9-80320d887584","Type":"ContainerStarted","Data":"49172697b7c5d33824246d8d90b3bbdd9a6e86b35fabbfdf67cdd54c8bc30d50"} Apr 16 18:17:41.758056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.757608 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cj779" event={"ID":"f68745e4-3c2b-4cbe-80a9-80320d887584","Type":"ContainerStarted","Data":"8c369364db29ab8f3dcf1ad80ff9a6e53e4dc4d7a86d0af57b6be5f90e622c59"} Apr 16 18:17:41.758874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.758847 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7t997" event={"ID":"3bf7ebd6-29a1-4bfa-957a-f30502b557e5","Type":"ContainerStarted","Data":"8c1049c483a40bd689fa7a6b05f2eafffcef9d9edca6008cc1fefe41da54ada6"} Apr 16 18:17:41.760064 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.760042 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="377651d3a2e3ff690164ada8ee8847447089d2644e5eb8eeaf408e115feaee81" exitCode=0 Apr 16 18:17:41.760202 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.760128 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"377651d3a2e3ff690164ada8ee8847447089d2644e5eb8eeaf408e115feaee81"} Apr 16 18:17:41.761570 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.761540 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-4mngt" 
event={"ID":"12cb712b-2a1d-4af9-a5dc-79912365f003","Type":"ContainerStarted","Data":"fd75c6948519a42723b65f39eb20724e56af180e6202465168deaa267a7e500d"} Apr 16 18:17:41.762970 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.762926 2582 generic.go:358] "Generic (PLEG): container finished" podID="38c6a73e51b044e7d224a339aafa2682" containerID="331f037e2dd3d4d679ab702d8a509eb45634f928aaeb502e244d389ad98f363a" exitCode=0 Apr 16 18:17:41.763029 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.762981 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" event={"ID":"38c6a73e51b044e7d224a339aafa2682","Type":"ContainerDied","Data":"331f037e2dd3d4d679ab702d8a509eb45634f928aaeb502e244d389ad98f363a"} Apr 16 18:17:41.764322 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.764286 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" event={"ID":"4c9d1315a36f55d6d5f62663bd6ecf76","Type":"ContainerStarted","Data":"6d6fde33bd5fcbba15473c57f05fe27b738206ba4dd2148934582ab77c2d394a"} Apr 16 18:17:41.765631 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.765611 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" event={"ID":"e0e2d86c-c5dd-4964-a804-52d78bb9d593","Type":"ContainerStarted","Data":"f0b543c31a2e434a6f6f46e71feebf44f708502c0f13b2c69aa7d38bd5f4359c"} Apr 16 18:17:41.766956 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.766932 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" event={"ID":"f6f826d5-a016-4f49-8153-5a1a3cd21012","Type":"ContainerStarted","Data":"0cf9a617558921bd3f1572025f685847016d7923940e04739662291604a7728d"} Apr 16 18:17:41.769238 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769221 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log" Apr 16 18:17:41.769516 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769491 2582 generic.go:358] "Generic (PLEG): container finished" podID="74787fd3-6aff-45fa-b4f4-4f97b01f0899" containerID="122fe6d94ecfcc3fce8c19cf5f98b6cce26f109cdaa7970bdbbae42d01f739e7" exitCode=1 Apr 16 18:17:41.769558 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769542 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"891cbbbff50e5197db5910b302c8e50ef8a671b3937d35eb2966db0ede5c8fd1"} Apr 16 18:17:41.769600 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769560 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"1f9b211463bf91bcf306146d3b430d0783f8d4ee02e5a7736e0a8ec7256d033e"} Apr 16 18:17:41.769600 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769569 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"76f780f9733d0e1f9a446a016875a20751b490160b5fa5c6ddea8b50b23664d0"} Apr 16 18:17:41.769600 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769578 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"6682e1f69f0edf352a855a3ca2854ed7ba2a98b5ed7d352d378743ac210169ab"} Apr 16 18:17:41.769600 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769586 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" 
event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerDied","Data":"122fe6d94ecfcc3fce8c19cf5f98b6cce26f109cdaa7970bdbbae42d01f739e7"}
Apr 16 18:17:41.769600 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.769595 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"09ec3c98cea0c532980ad5a51074cbba4ed06bd9e5afb4da8df4701776d651cd"}
Apr 16 18:17:41.770759 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.770743 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-n229m" event={"ID":"bead58a1-c7d1-4221-8dba-7355ad1eee28","Type":"ContainerStarted","Data":"e22afde68d81cbf05dc0bf2e074decd8d36a7c5af4e79c70226bf8f4aa4e304f"}
Apr 16 18:17:41.775355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.775322 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cj779" podStartSLOduration=14.775310709 podStartE2EDuration="14.775310709s" podCreationTimestamp="2026-04-16 18:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:41.774788215 +0000 UTC m=+20.750904427" watchObservedRunningTime="2026-04-16 18:17:41.775310709 +0000 UTC m=+20.751426919"
Apr 16 18:17:41.790373 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.790328 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-n229m" podStartSLOduration=3.067802194 podStartE2EDuration="20.790312444s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.838631233 +0000 UTC m=+1.814747427" lastFinishedPulling="2026-04-16 18:17:40.561141475 +0000 UTC m=+19.537257677" observedRunningTime="2026-04-16 18:17:41.790213915 +0000 UTC m=+20.766330126" watchObservedRunningTime="2026-04-16 18:17:41.790312444 +0000 UTC m=+20.766428657"
Apr 16 18:17:41.836746 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.836686 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-4mngt" podStartSLOduration=3.09633828 podStartE2EDuration="20.836663993s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.831352571 +0000 UTC m=+1.807468760" lastFinishedPulling="2026-04-16 18:17:40.571678285 +0000 UTC m=+19.547794473" observedRunningTime="2026-04-16 18:17:41.836141267 +0000 UTC m=+20.812257479" watchObservedRunningTime="2026-04-16 18:17:41.836663993 +0000 UTC m=+20.812780208"
Apr 16 18:17:41.853791 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.853738 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7t997" podStartSLOduration=3.152206929 podStartE2EDuration="20.853723008s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.859571238 +0000 UTC m=+1.835687428" lastFinishedPulling="2026-04-16 18:17:40.561087304 +0000 UTC m=+19.537203507" observedRunningTime="2026-04-16 18:17:41.853384965 +0000 UTC m=+20.829501177" watchObservedRunningTime="2026-04-16 18:17:41.853723008 +0000 UTC m=+20.829839219"
Apr 16 18:17:41.872011 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.871969 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-v2mgc" podStartSLOduration=3.177451119 podStartE2EDuration="20.871953741s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.869653467 +0000 UTC m=+1.845769660" lastFinishedPulling="2026-04-16 18:17:40.564156078 +0000 UTC m=+19.540272282" observedRunningTime="2026-04-16 18:17:41.871702316 +0000 UTC m=+20.847818528" watchObservedRunningTime="2026-04-16 18:17:41.871953741 +0000 UTC m=+20.848069951"
Apr 16 18:17:41.905894 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:41.905842 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-117.ec2.internal" podStartSLOduration=19.905822211 podStartE2EDuration="19.905822211s" podCreationTimestamp="2026-04-16 18:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:41.905289545 +0000 UTC m=+20.881405757" watchObservedRunningTime="2026-04-16 18:17:41.905822211 +0000 UTC m=+20.881938422"
Apr 16 18:17:42.041750 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.041639 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:42.502055 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.502027 2582 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:17:42.556199 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.556056 2582 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:17:42.502048054Z","UUID":"4326a723-ba1f-4108-9f44-e464b82e7233","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:17:42.558989 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.558947 2582 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:17:42.558989 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.558990 2582 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:17:42.614945 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.614872 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:42.615088 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.614872 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:42.615088 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:42.615010 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:42.615190 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:42.615083 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:42.774586 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.774531 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgnq9" event={"ID":"14a34dbf-ea86-4998-809d-5078b679506c","Type":"ContainerStarted","Data":"19f2c4d977c7a76af0c6f689b083bf2ec9c6330e909437e109857dc67f88926c"}
Apr 16 18:17:42.776706 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.776681 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" event={"ID":"38c6a73e51b044e7d224a339aafa2682","Type":"ContainerStarted","Data":"d9bd3b3a2aa0b71ad358c4e37ae9742eb8970834a9c5b352b1d140c9308331e8"}
Apr 16 18:17:42.778630 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.778501 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" event={"ID":"f6f826d5-a016-4f49-8153-5a1a3cd21012","Type":"ContainerStarted","Data":"332ae2bc85a255ed5ebb284cebad8b9f7185d65b2406a95b28df8fb929a4b688"}
Apr 16 18:17:42.808270 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.808148 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-117.ec2.internal" podStartSLOduration=20.808133462 podStartE2EDuration="20.808133462s" podCreationTimestamp="2026-04-16 18:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:42.807998823 +0000 UTC m=+21.784115035" watchObservedRunningTime="2026-04-16 18:17:42.808133462 +0000 UTC m=+21.784249672"
Apr 16 18:17:42.808511 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:42.808480 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qgnq9" podStartSLOduration=4.098543998 podStartE2EDuration="21.808468599s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.851575101 +0000 UTC m=+1.827691296" lastFinishedPulling="2026-04-16 18:17:40.561499697 +0000 UTC m=+19.537615897" observedRunningTime="2026-04-16 18:17:42.791949012 +0000 UTC m=+21.768065225" watchObservedRunningTime="2026-04-16 18:17:42.808468599 +0000 UTC m=+21.784584811"
Apr 16 18:17:43.783833 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.783804 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:17:43.784346 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.784244 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"3c33561e88c62f57292cff68cac54b8b296401f161abe9d87e808a0f4fbbade8"}
Apr 16 18:17:43.786216 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.786180 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" event={"ID":"f6f826d5-a016-4f49-8153-5a1a3cd21012","Type":"ContainerStarted","Data":"52dc654fa161a963fa45b1658623a2c8b67d4e4d0fd0fccf99e2bfafa4bc6526"}
Apr 16 18:17:43.804694 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.804661 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:43.805480 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.805452 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:43.823163 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:43.823112 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-p7h54" podStartSLOduration=2.450643317 podStartE2EDuration="22.823081813s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.865248219 +0000 UTC m=+1.841364408" lastFinishedPulling="2026-04-16 18:17:43.237686702 +0000 UTC m=+22.213802904" observedRunningTime="2026-04-16 18:17:43.807078272 +0000 UTC m=+22.783194486" watchObservedRunningTime="2026-04-16 18:17:43.823081813 +0000 UTC m=+22.799198025"
Apr 16 18:17:44.614393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:44.614358 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:44.614572 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:44.614406 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:44.614572 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:44.614494 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:44.614683 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:44.614573 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:44.788964 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:44.788922 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7t997"
Apr 16 18:17:45.795505 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.794966 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:17:45.796393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.795614 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"290ed806f7e7b0a91ffd9b8038bc4ffed1209944a61c20c3fb2d458812fb791e"}
Apr 16 18:17:45.796393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.795840 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:45.796393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.795868 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:45.796393 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.796026 2582 scope.go:117] "RemoveContainer" containerID="122fe6d94ecfcc3fce8c19cf5f98b6cce26f109cdaa7970bdbbae42d01f739e7"
Apr 16 18:17:45.815950 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.815690 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:45.822497 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:45.822470 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:46.614602 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.614568 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:46.614748 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.614571 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:46.614748 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:46.614669 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:46.614836 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:46.614750 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:46.802447 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.802409 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:17:46.803154 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.803086 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" event={"ID":"74787fd3-6aff-45fa-b4f4-4f97b01f0899","Type":"ContainerStarted","Data":"a4991c125e2832a9108e69a89db9ca9a0d5eade31c27bf6e3896692d46388b30"}
Apr 16 18:17:46.803421 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.803379 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4"
Apr 16 18:17:46.804749 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.804724 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="d74e1ea0138093b2b7f9afb367d15df6ace0c88e21f92eec392e78ca86366d61" exitCode=0
Apr 16 18:17:46.804844 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.804761 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"d74e1ea0138093b2b7f9afb367d15df6ace0c88e21f92eec392e78ca86366d61"}
Apr 16 18:17:46.833844 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:46.833799 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" podStartSLOduration=7.815352902 podStartE2EDuration="25.833783782s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.855770139 +0000 UTC m=+1.831886343" lastFinishedPulling="2026-04-16 18:17:40.87420103 +0000 UTC m=+19.850317223" observedRunningTime="2026-04-16 18:17:46.832747526 +0000 UTC m=+25.808863738" watchObservedRunningTime="2026-04-16 18:17:46.833783782 +0000 UTC m=+25.809899994"
Apr 16 18:17:47.679674 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.679407 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-crg6m"]
Apr 16 18:17:47.679802 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.679700 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:47.679802 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:47.679791 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:47.683167 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.683142 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4zpgf"]
Apr 16 18:17:47.683283 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.683258 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:47.683385 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:47.683366 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:47.807995 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.807961 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="dc7d4c735d3f545a619c1757146f010c2f51011c1f4dca84c9e1d72899aee0a4" exitCode=0
Apr 16 18:17:47.808496 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:47.808050 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"dc7d4c735d3f545a619c1757146f010c2f51011c1f4dca84c9e1d72899aee0a4"}
Apr 16 18:17:48.812325 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:48.812236 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="550e50167401045ad7836d0ea9d75921c3afcc36cdddab10832ada848e8a675d" exitCode=0
Apr 16 18:17:48.812325 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:48.812294 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"550e50167401045ad7836d0ea9d75921c3afcc36cdddab10832ada848e8a675d"}
Apr 16 18:17:49.614360 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:49.614329 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:49.614540 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:49.614330 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:49.614540 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:49.614471 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:49.614540 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:49.614529 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:51.615335 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:51.615306 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:51.615761 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:51.615402 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:51.615761 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:51.615439 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:51.615761 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:51.615460 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:53.614156 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.614111 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:53.614156 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.614163 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:53.614655 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:53.614283 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-crg6m" podUID="439470b0-687a-4bea-ad03-3eebe6cb41cd"
Apr 16 18:17:53.614655 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:53.614358 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09"
Apr 16 18:17:53.859967 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.859885 2582 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-117.ec2.internal" event="NodeReady"
Apr 16 18:17:53.860154 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.860035 2582 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 18:17:53.916757 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.916724 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-js745"]
Apr 16 18:17:53.943956 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.943921 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vf74w"]
Apr 16 18:17:53.944140 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.944126 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-js745"
Apr 16 18:17:53.946929 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.946906 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 18:17:53.947133 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.946939 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\""
Apr 16 18:17:53.947405 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.947386 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 18:17:53.967286 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.967251 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vf74w"]
Apr 16 18:17:53.967286 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.967285 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-js745"]
Apr 16 18:17:53.967479 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.967385 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:53.970478 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.970455 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 18:17:53.970478 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.970474 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 18:17:53.970704 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.970456 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\""
Apr 16 18:17:53.970770 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:53.970738 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 18:17:54.033319 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033279 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.033481 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033351 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59ffg\" (UniqueName: \"kubernetes.io/projected/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-kube-api-access-59ffg\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.033481 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033404 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbrvq\" (UniqueName: \"kubernetes.io/projected/0362b269-7b97-4579-a5a1-f882325a361a-kube-api-access-sbrvq\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.033617 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033516 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-tmp-dir\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.033617 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033563 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-config-volume\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.033724 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.033645 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.134713 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134630 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59ffg\" (UniqueName: \"kubernetes.io/projected/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-kube-api-access-59ffg\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.134713 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134680 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbrvq\" (UniqueName: \"kubernetes.io/projected/0362b269-7b97-4579-a5a1-f882325a361a-kube-api-access-sbrvq\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.134953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134715 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-tmp-dir\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.134953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134751 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-config-volume\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.134953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134916 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.134953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.134942 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.135182 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.135058 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 18:17:54.135182 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.135071 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:54.135182 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.135118 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-tmp-dir\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.135182 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.135139 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:54.635118914 +0000 UTC m=+33.611235124 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found
Apr 16 18:17:54.135182 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.135157 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:54.635149017 +0000 UTC m=+33.611265205 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found
Apr 16 18:17:54.140475 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.140453 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-config-volume\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.149271 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.149247 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59ffg\" (UniqueName: \"kubernetes.io/projected/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-kube-api-access-59ffg\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.149444 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.149421 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbrvq\" (UniqueName: \"kubernetes.io/projected/0362b269-7b97-4579-a5a1-f882325a361a-kube-api-access-sbrvq\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.336480 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.336435 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:17:54.336661 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.336590 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:54.336661 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.336654 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:26.336640855 +0000 UTC m=+65.312757044 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:17:54.437357 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.437325 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:17:54.437509 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.437489 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:17:54.437562 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.437514 2582 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:17:54.437562 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.437526 2582 projected.go:194] Error preparing data for projected volume kube-api-access-rzzsq for pod openshift-network-diagnostics/network-check-target-crg6m: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:54.437631 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.437576 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq podName:439470b0-687a-4bea-ad03-3eebe6cb41cd nodeName:}" failed. No retries permitted until 2026-04-16 18:18:26.43756176 +0000 UTC m=+65.413677949 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rzzsq" (UniqueName: "kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq") pod "network-check-target-crg6m" (UID: "439470b0-687a-4bea-ad03-3eebe6cb41cd") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:17:54.639261 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.639223 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745"
Apr 16 18:17:54.639261 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.639264 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w"
Apr 16 18:17:54.639779 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.639357 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:17:54.639779 ip-10-0-139-117 kubenswrapper[2582]: E0416
18:17:54.639368 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:54.639779 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.639411 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.639397295 +0000 UTC m=+34.615513485 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:17:54.639779 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:54.639425 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:55.639419057 +0000 UTC m=+34.615535245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:17:54.826122 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:54.826072 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerStarted","Data":"b35da26c8a8d8fdb723a4621ad53f28e150d0adc1d78a63a2ca4b1ea56c3e733"} Apr 16 18:17:55.614165 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.614126 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:17:55.614466 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.614128 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:17:55.616944 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.616924 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:17:55.617114 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.617023 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:17:55.618049 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.618032 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v57pk\"" Apr 16 18:17:55.618165 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.618085 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:17:55.618165 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.618156 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\"" Apr 16 18:17:55.647066 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.647030 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:17:55.647066 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.647068 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:17:55.647539 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:55.647176 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:55.647539 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:55.647194 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:55.647539 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:55.647238 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:17:57.647223452 +0000 UTC m=+36.623339641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:17:55.647539 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:55.647252 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:57.647245612 +0000 UTC m=+36.623361801 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:17:55.829950 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.829918 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="b35da26c8a8d8fdb723a4621ad53f28e150d0adc1d78a63a2ca4b1ea56c3e733" exitCode=0 Apr 16 18:17:55.830126 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:55.829968 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"b35da26c8a8d8fdb723a4621ad53f28e150d0adc1d78a63a2ca4b1ea56c3e733"} Apr 16 18:17:56.834454 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:56.834422 2582 generic.go:358] "Generic (PLEG): container finished" podID="c19385c2-b1c2-45bc-a50b-91342bfe5265" containerID="c441809e13de499fc94a17ba4ae943c79e93fdb04810167e4cd56c6b235a4ea1" exitCode=0 Apr 16 18:17:56.835023 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:56.834492 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerDied","Data":"c441809e13de499fc94a17ba4ae943c79e93fdb04810167e4cd56c6b235a4ea1"} Apr 16 18:17:57.663444 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:57.663411 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:17:57.663622 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:57.663450 2582 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:17:57.663622 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:57.663564 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:57.663744 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:57.663631 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:01.663613075 +0000 UTC m=+40.639729264 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:17:57.663744 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:57.663652 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:57.663744 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:17:57.663713 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:01.663695179 +0000 UTC m=+40.639811371 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:17:57.839020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:57.838979 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" event={"ID":"c19385c2-b1c2-45bc-a50b-91342bfe5265","Type":"ContainerStarted","Data":"5069ce867d905b7fbcdab41c9b5c4c8519451fd9e14dd4a0040c7e952c0bd55a"} Apr 16 18:17:57.884694 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:17:57.884537 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hpm7j" podStartSLOduration=5.050079539 podStartE2EDuration="36.884518832s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:17:22.845029453 +0000 UTC m=+1.821145642" lastFinishedPulling="2026-04-16 18:17:54.679468747 +0000 UTC m=+33.655584935" observedRunningTime="2026-04-16 18:17:57.884063386 +0000 UTC m=+36.860179607" watchObservedRunningTime="2026-04-16 18:17:57.884518832 +0000 UTC m=+36.860635044" Apr 16 18:18:01.691455 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:01.691407 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:18:01.691455 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:01.691452 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " 
pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:18:01.691954 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:01.691544 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:01.691954 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:01.691552 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:01.691954 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:01.691608 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.691592289 +0000 UTC m=+48.667708478 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:18:01.691954 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:01.691624 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:09.691617803 +0000 UTC m=+48.667733993 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:18:09.750843 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:09.750796 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:18:09.750843 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:09.750847 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:18:09.751406 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:09.750958 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:09.751406 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:09.751016 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:25.75100239 +0000 UTC m=+64.727118578 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:18:09.751406 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:09.750958 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:09.751406 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:09.751085 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:25.751072311 +0000 UTC m=+64.727188504 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:18:17.820197 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:17.820167 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2frx4" Apr 16 18:18:25.764609 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:25.764561 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:18:25.764609 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:25.764614 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: 
\"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:18:25.765187 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:25.764706 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:25.765187 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:25.764743 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:25.765187 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:25.764769 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:57.764753293 +0000 UTC m=+96.740869482 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found Apr 16 18:18:25.765187 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:25.764797 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:18:57.764781346 +0000 UTC m=+96.740897535 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found Apr 16 18:18:26.369164 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.369123 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:18:26.372068 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.372048 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:18:26.379425 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:26.379407 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:18:26.379489 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:26.379474 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:30.379457505 +0000 UTC m=+129.355573694 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : secret "metrics-daemon-secret" not found Apr 16 18:18:26.470337 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.470300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:18:26.473017 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.472997 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:18:26.484235 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.484217 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:18:26.494241 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.494216 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzzsq\" (UniqueName: \"kubernetes.io/projected/439470b0-687a-4bea-ad03-3eebe6cb41cd-kube-api-access-rzzsq\") pod \"network-check-target-crg6m\" (UID: \"439470b0-687a-4bea-ad03-3eebe6cb41cd\") " pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:18:26.527466 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.527434 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v57pk\"" Apr 16 18:18:26.534904 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.534883 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:18:26.725377 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.725344 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-crg6m"] Apr 16 18:18:26.729308 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:18:26.729277 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod439470b0_687a_4bea_ad03_3eebe6cb41cd.slice/crio-82b61e0dc4b9364cd17a071cd10dcadaea01b140f7a8ee128fca29a49662d6b6 WatchSource:0}: Error finding container 82b61e0dc4b9364cd17a071cd10dcadaea01b140f7a8ee128fca29a49662d6b6: Status 404 returned error can't find the container with id 82b61e0dc4b9364cd17a071cd10dcadaea01b140f7a8ee128fca29a49662d6b6 Apr 16 18:18:26.897942 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:26.897856 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-crg6m" event={"ID":"439470b0-687a-4bea-ad03-3eebe6cb41cd","Type":"ContainerStarted","Data":"82b61e0dc4b9364cd17a071cd10dcadaea01b140f7a8ee128fca29a49662d6b6"} Apr 16 18:18:29.905323 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:29.905286 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-crg6m" event={"ID":"439470b0-687a-4bea-ad03-3eebe6cb41cd","Type":"ContainerStarted","Data":"a58ca908344213e4c24732609744defca9d44d921d4b029844539f8764bde107"} Apr 16 18:18:29.905781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:29.905415 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-crg6m" Apr 16 18:18:29.924000 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:29.923947 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-crg6m" 
podStartSLOduration=66.190462156 podStartE2EDuration="1m8.923933021s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:18:26.731149801 +0000 UTC m=+65.707265990" lastFinishedPulling="2026-04-16 18:18:29.464620666 +0000 UTC m=+68.440736855" observedRunningTime="2026-04-16 18:18:29.923882433 +0000 UTC m=+68.899998644" watchObservedRunningTime="2026-04-16 18:18:29.923933021 +0000 UTC m=+68.900049228" Apr 16 18:18:57.789224 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:57.789170 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:18:57.789224 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:18:57.789227 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:18:57.789735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:57.789329 2582 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:57.789735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:57.789395 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls podName:2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:01.789376053 +0000 UTC m=+160.765492241 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls") pod "dns-default-js745" (UID: "2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1") : secret "dns-default-metrics-tls" not found
Apr 16 18:18:57.789735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:57.789338 2582 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 18:18:57.789735 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:18:57.789485 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert podName:0362b269-7b97-4579-a5a1-f882325a361a nodeName:}" failed. No retries permitted until 2026-04-16 18:20:01.789468282 +0000 UTC m=+160.765584474 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert") pod "ingress-canary-vf74w" (UID: "0362b269-7b97-4579-a5a1-f882325a361a") : secret "canary-serving-cert" not found
Apr 16 18:19:00.909446 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:00.909416 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-crg6m"
Apr 16 18:19:30.415431 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:30.415387 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:19:30.415930 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:30.415537 2582 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 18:19:30.415930 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:30.415603 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs podName:847e2695-c897-4ed9-95c4-10d0fbef9e09 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:32.415585304 +0000 UTC m=+251.391701511 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs") pod "network-metrics-daemon-4zpgf" (UID: "847e2695-c897-4ed9-95c4-10d0fbef9e09") : secret "metrics-daemon-secret" not found
Apr 16 18:19:47.550267 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.550230 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"]
Apr 16 18:19:47.553120 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.553085 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"
Apr 16 18:19:47.555933 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.555909 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.556139 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.555967 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.556789 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.556772 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-zx4w2\""
Apr 16 18:19:47.560982 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.560957 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"]
Apr 16 18:19:47.649471 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.649434 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"]
Apr 16 18:19:47.652221 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.652197 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.656177 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.656156 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-66qlx\""
Apr 16 18:19:47.656385 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.656368 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 16 18:19:47.656464 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.656427 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.656464 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.656452 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.656584 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.656469 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 16 18:19:47.664646 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.664623 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"]
Apr 16 18:19:47.734546 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.734509 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggxv\" (UniqueName: \"kubernetes.io/projected/ff19de9b-cc93-426d-9316-5a00b8359309-kube-api-access-kggxv\") pod \"volume-data-source-validator-7d955d5dd4-flgw2\" (UID: \"ff19de9b-cc93-426d-9316-5a00b8359309\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"
Apr 16 18:19:47.761797 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.761769 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"]
Apr 16 18:19:47.764480 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.764464 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"]
Apr 16 18:19:47.764632 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.764614 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"
Apr 16 18:19:47.767035 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.767017 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:47.772272 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:47.772244 2582 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"network-diagnostics-dockercfg-4dpsb\" is forbidden: User \"system:node:ip-10-0-139-117.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-network-diagnostics\": no relationship found between node 'ip-10-0-139-117.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4dpsb\"" type="*v1.Secret"
Apr 16 18:19:47.772436 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.772420 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.772521 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.772421 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 18:19:47.772521 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.772444 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 18:19:47.772773 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.772760 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.772900 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.772885 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-v8kww\""
Apr 16 18:19:47.782002 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.781974 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kmx4l"]
Apr 16 18:19:47.784295 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.784261 2582 status_manager.go:895] "Failed to get status for pod" podUID="15d17920-d56e-4995-bbc2-c4b5a72e3162" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb" err="pods \"network-check-source-7b678d77c7-hzrtb\" is forbidden: User \"system:node:ip-10-0-139-117.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-network-diagnostics\": no relationship found between node 'ip-10-0-139-117.ec2.internal' and this object"
Apr 16 18:19:47.784717 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.784699 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"]
Apr 16 18:19:47.784811 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.784792 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:47.785160 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.785059 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"]
Apr 16 18:19:47.787767 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.787748 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:47.788580 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.788564 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 18:19:47.788820 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.788804 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-76kbz\""
Apr 16 18:19:47.789300 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.789285 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 18:19:47.790899 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.790875 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"]
Apr 16 18:19:47.790998 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.790916 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.790998 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.790972 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:19:47.792009 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.791992 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.792129 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.792002 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.792201 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.792013 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-2k67j\""
Apr 16 18:19:47.792201 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.792193 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.792612 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.792594 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:19:47.799391 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.799367 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"]
Apr 16 18:19:47.811610 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.811536 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 18:19:47.811939 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.811909 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kmx4l"]
Apr 16 18:19:47.835868 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.835825 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kggxv\" (UniqueName: \"kubernetes.io/projected/ff19de9b-cc93-426d-9316-5a00b8359309-kube-api-access-kggxv\") pod \"volume-data-source-validator-7d955d5dd4-flgw2\" (UID: \"ff19de9b-cc93-426d-9316-5a00b8359309\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"
Apr 16 18:19:47.836054 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.835959 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.836054 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.835998 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.836199 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.836055 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4wx5\" (UniqueName: \"kubernetes.io/projected/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-kube-api-access-w4wx5\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.853360 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.853329 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggxv\" (UniqueName: \"kubernetes.io/projected/ff19de9b-cc93-426d-9316-5a00b8359309-kube-api-access-kggxv\") pod \"volume-data-source-validator-7d955d5dd4-flgw2\" (UID: \"ff19de9b-cc93-426d-9316-5a00b8359309\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"
Apr 16 18:19:47.861342 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.861307 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"]
Apr 16 18:19:47.861492 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.861474 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"
Apr 16 18:19:47.864359 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.864344 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"
Apr 16 18:19:47.867162 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.867141 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:19:47.867277 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.867203 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:19:47.867277 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.867255 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-87bv2\""
Apr 16 18:19:47.867688 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.867665 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:19:47.877406 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.877383 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"]
Apr 16 18:19:47.936552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936397 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.936552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936452 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqzmf\" (UniqueName: \"kubernetes.io/projected/ead2ae15-459e-4b99-898d-ae36578d9ffa-kube-api-access-mqzmf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:47.936552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936495 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w4wx5\" (UniqueName: \"kubernetes.io/projected/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-kube-api-access-w4wx5\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.936552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936545 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtmvf\" (UniqueName: \"kubernetes.io/projected/15d17920-d56e-4995-bbc2-c4b5a72e3162-kube-api-access-xtmvf\") pod \"network-check-source-7b678d77c7-hzrtb\" (UID: \"15d17920-d56e-4995-bbc2-c4b5a72e3162\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936572 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ead2ae15-459e-4b99-898d-ae36578d9ffa-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936602 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4257e-9483-4d05-bec2-a89b52ff2015-serving-cert\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936633 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314b9f84-3289-4525-9f15-23803f5ec61a-config\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936679 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vg9d\" (UniqueName: \"kubernetes.io/projected/16f4257e-9483-4d05-bec2-a89b52ff2015-kube-api-access-6vg9d\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936713 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-config\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:47.936760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936739 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd7pp\" (UniqueName: \"kubernetes.io/projected/314b9f84-3289-4525-9f15-23803f5ec61a-kube-api-access-vd7pp\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:47.936829 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936872 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ead2ae15-459e-4b99-898d-ae36578d9ffa-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:47.936893 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:19:48.43687377 +0000 UTC m=+147.412989958 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936924 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314b9f84-3289-4525-9f15-23803f5ec61a-serving-cert\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936955 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.937027 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.936981 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-trusted-ca\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:47.937910 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.937887 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.946122 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.946074 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4wx5\" (UniqueName: \"kubernetes.io/projected/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-kube-api-access-w4wx5\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"
Apr 16 18:19:47.982945 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:47.982911 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2"]
Apr 16 18:19:47.986116 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:47.986071 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff19de9b_cc93_426d_9316_5a00b8359309.slice/crio-9eddff3d5129664b482577aef1e87f395d84d68d896ae104e57d03ac1065816b WatchSource:0}: Error finding container 9eddff3d5129664b482577aef1e87f395d84d68d896ae104e57d03ac1065816b: Status 404 returned error can't find the container with id 9eddff3d5129664b482577aef1e87f395d84d68d896ae104e57d03ac1065816b
Apr 16 18:19:48.037594 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037552 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314b9f84-3289-4525-9f15-23803f5ec61a-config\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037608 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6vg9d\" (UniqueName: \"kubernetes.io/projected/16f4257e-9483-4d05-bec2-a89b52ff2015-kube-api-access-6vg9d\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037640 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-config\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037656 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd7pp\" (UniqueName: \"kubernetes.io/projected/314b9f84-3289-4525-9f15-23803f5ec61a-kube-api-access-vd7pp\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037693 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ead2ae15-459e-4b99-898d-ae36578d9ffa-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037716 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314b9f84-3289-4525-9f15-23803f5ec61a-serving-cert\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.037781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.037743 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-trusted-ca\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.038195 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038165 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqzmf\" (UniqueName: \"kubernetes.io/projected/ead2ae15-459e-4b99-898d-ae36578d9ffa-kube-api-access-mqzmf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038246 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xtmvf\" (UniqueName: \"kubernetes.io/projected/15d17920-d56e-4995-bbc2-c4b5a72e3162-kube-api-access-xtmvf\") pod \"network-check-source-7b678d77c7-hzrtb\" (UID: \"15d17920-d56e-4995-bbc2-c4b5a72e3162\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038275 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ead2ae15-459e-4b99-898d-ae36578d9ffa-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038282 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/314b9f84-3289-4525-9f15-23803f5ec61a-config\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038309 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4257e-9483-4d05-bec2-a89b52ff2015-serving-cert\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038320 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ead2ae15-459e-4b99-898d-ae36578d9ffa-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.038355 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038342 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"
Apr 16 18:19:48.038649 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038374 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrld\" (UniqueName: \"kubernetes.io/projected/6eb60089-5b1f-4d4a-a582-f110affcf229-kube-api-access-bsrld\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"
Apr 16 18:19:48.038649 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038458 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-config\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.038892 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.038867 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16f4257e-9483-4d05-bec2-a89b52ff2015-trusted-ca\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.040233 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.040213 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/314b9f84-3289-4525-9f15-23803f5ec61a-serving-cert\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.040399 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.040380 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ead2ae15-459e-4b99-898d-ae36578d9ffa-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.040554 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.040540 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16f4257e-9483-4d05-bec2-a89b52ff2015-serving-cert\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.047473 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.047443 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd7pp\" (UniqueName: \"kubernetes.io/projected/314b9f84-3289-4525-9f15-23803f5ec61a-kube-api-access-vd7pp\") pod \"service-ca-operator-69965bb79d-kxdgs\" (UID: \"314b9f84-3289-4525-9f15-23803f5ec61a\") " pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.047607 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.047589 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vg9d\" (UniqueName: \"kubernetes.io/projected/16f4257e-9483-4d05-bec2-a89b52ff2015-kube-api-access-6vg9d\") pod \"console-operator-d87b8d5fc-kmx4l\" (UID: \"16f4257e-9483-4d05-bec2-a89b52ff2015\") " pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.047653 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.047621 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtmvf\" (UniqueName: \"kubernetes.io/projected/15d17920-d56e-4995-bbc2-c4b5a72e3162-kube-api-access-xtmvf\") pod \"network-check-source-7b678d77c7-hzrtb\" (UID: \"15d17920-d56e-4995-bbc2-c4b5a72e3162\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"
Apr 16 18:19:48.047778 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.047760 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqzmf\" (UniqueName: \"kubernetes.io/projected/ead2ae15-459e-4b99-898d-ae36578d9ffa-kube-api-access-mqzmf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-m887h\" (UID: \"ead2ae15-459e-4b99-898d-ae36578d9ffa\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.052954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.052929 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2" event={"ID":"ff19de9b-cc93-426d-9316-5a00b8359309","Type":"ContainerStarted","Data":"9eddff3d5129664b482577aef1e87f395d84d68d896ae104e57d03ac1065816b"}
Apr 16 18:19:48.080309 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.080225 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"
Apr 16 18:19:48.095172 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.095147 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l"
Apr 16 18:19:48.100861 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.100834 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"
Apr 16 18:19:48.139313 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.139282 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"
Apr 16 18:19:48.139479 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.139318 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrld\" (UniqueName: \"kubernetes.io/projected/6eb60089-5b1f-4d4a-a582-f110affcf229-kube-api-access-bsrld\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"
Apr 16 18:19:48.139548 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.139503 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:19:48.139601 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.139576 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls podName:6eb60089-5b1f-4d4a-a582-f110affcf229 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:48.639556905 +0000 UTC m=+147.615673109 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls") pod "cluster-samples-operator-667775844f-fsln7" (UID: "6eb60089-5b1f-4d4a-a582-f110affcf229") : secret "samples-operator-tls" not found Apr 16 18:19:48.153555 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.153487 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrld\" (UniqueName: \"kubernetes.io/projected/6eb60089-5b1f-4d4a-a582-f110affcf229-kube-api-access-bsrld\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:19:48.213435 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.213402 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs"] Apr 16 18:19:48.217170 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:48.217077 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314b9f84_3289_4525_9f15_23803f5ec61a.slice/crio-e409c5b10f7c47d03dea8c5061356c0c56bc06a25f6c8c569d23b44b99ea256f WatchSource:0}: Error finding container e409c5b10f7c47d03dea8c5061356c0c56bc06a25f6c8c569d23b44b99ea256f: Status 404 returned error can't find the container with id e409c5b10f7c47d03dea8c5061356c0c56bc06a25f6c8c569d23b44b99ea256f Apr 16 18:19:48.234514 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.234488 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h"] Apr 16 18:19:48.237022 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:48.236997 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podead2ae15_459e_4b99_898d_ae36578d9ffa.slice/crio-9914480ec76713ad42523385c612554b81918f01b71f37552f0c85cfb0c6d773 WatchSource:0}: Error finding container 9914480ec76713ad42523385c612554b81918f01b71f37552f0c85cfb0c6d773: Status 404 returned error can't find the container with id 9914480ec76713ad42523385c612554b81918f01b71f37552f0c85cfb0c6d773 Apr 16 18:19:48.249865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.249841 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-kmx4l"] Apr 16 18:19:48.252671 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:48.252642 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f4257e_9483_4d05_bec2_a89b52ff2015.slice/crio-d622778d51477c96daec51a52fc23b83a6d439e71de2a5febaacce550acb9cf9 WatchSource:0}: Error finding container d622778d51477c96daec51a52fc23b83a6d439e71de2a5febaacce550acb9cf9: Status 404 returned error can't find the container with id d622778d51477c96daec51a52fc23b83a6d439e71de2a5febaacce550acb9cf9 Apr 16 18:19:48.442151 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.442121 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:19:48.442315 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.442231 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:48.442315 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.442294 2582 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:19:49.442276674 +0000 UTC m=+148.418392881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:48.618713 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.618683 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4dpsb\"" Apr 16 18:19:48.625154 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.625128 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb" Apr 16 18:19:48.644339 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.644309 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:19:48.644497 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.644474 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:48.644588 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:48.644575 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls podName:6eb60089-5b1f-4d4a-a582-f110affcf229 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:49.644552678 +0000 UTC m=+148.620668882 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls") pod "cluster-samples-operator-667775844f-fsln7" (UID: "6eb60089-5b1f-4d4a-a582-f110affcf229") : secret "samples-operator-tls" not found Apr 16 18:19:48.762310 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:48.762277 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb"] Apr 16 18:19:48.765491 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:48.765453 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d17920_d56e_4995_bbc2_c4b5a72e3162.slice/crio-55eede19e0aec5cb5dcca1bd35e30d404ab77df02ca07ee44face9965cfcf501 WatchSource:0}: Error finding container 55eede19e0aec5cb5dcca1bd35e30d404ab77df02ca07ee44face9965cfcf501: Status 404 returned error can't find the container with id 55eede19e0aec5cb5dcca1bd35e30d404ab77df02ca07ee44face9965cfcf501 Apr 16 18:19:49.057830 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.057737 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" event={"ID":"16f4257e-9483-4d05-bec2-a89b52ff2015","Type":"ContainerStarted","Data":"d622778d51477c96daec51a52fc23b83a6d439e71de2a5febaacce550acb9cf9"} Apr 16 18:19:49.060168 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.060064 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb" event={"ID":"15d17920-d56e-4995-bbc2-c4b5a72e3162","Type":"ContainerStarted","Data":"1d78147711b8d1ec55480284d1b0b834dfa6ebcfead0905066d6ab5904664d23"} Apr 16 18:19:49.060168 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.060122 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb" 
event={"ID":"15d17920-d56e-4995-bbc2-c4b5a72e3162","Type":"ContainerStarted","Data":"55eede19e0aec5cb5dcca1bd35e30d404ab77df02ca07ee44face9965cfcf501"} Apr 16 18:19:49.063174 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.063115 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h" event={"ID":"ead2ae15-459e-4b99-898d-ae36578d9ffa","Type":"ContainerStarted","Data":"9914480ec76713ad42523385c612554b81918f01b71f37552f0c85cfb0c6d773"} Apr 16 18:19:49.064855 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.064795 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs" event={"ID":"314b9f84-3289-4525-9f15-23803f5ec61a","Type":"ContainerStarted","Data":"e409c5b10f7c47d03dea8c5061356c0c56bc06a25f6c8c569d23b44b99ea256f"} Apr 16 18:19:49.078725 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.077646 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-hzrtb" podStartSLOduration=2.07762783 podStartE2EDuration="2.07762783s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:49.077156296 +0000 UTC m=+148.053272508" watchObservedRunningTime="2026-04-16 18:19:49.07762783 +0000 UTC m=+148.053744044" Apr 16 18:19:49.450555 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.450513 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:19:49.450746 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:49.450675 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:49.450802 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:49.450747 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:19:51.450726666 +0000 UTC m=+150.426842870 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:49.652471 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:49.652416 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:19:49.652875 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:49.652566 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:49.652875 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:49.652640 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls podName:6eb60089-5b1f-4d4a-a582-f110affcf229 
nodeName:}" failed. No retries permitted until 2026-04-16 18:19:51.652618905 +0000 UTC m=+150.628735109 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls") pod "cluster-samples-operator-667775844f-fsln7" (UID: "6eb60089-5b1f-4d4a-a582-f110affcf229") : secret "samples-operator-tls" not found Apr 16 18:19:50.070886 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:50.070828 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2" event={"ID":"ff19de9b-cc93-426d-9316-5a00b8359309","Type":"ContainerStarted","Data":"fe9177043c84342c748ddec93b6211f6d7392036a5dd6ccb3ccdd9fac6229de6"} Apr 16 18:19:50.088300 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:50.088252 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-flgw2" podStartSLOduration=1.5161314209999999 podStartE2EDuration="3.088233802s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:47.987730998 +0000 UTC m=+146.963847189" lastFinishedPulling="2026-04-16 18:19:49.559833365 +0000 UTC m=+148.535949570" observedRunningTime="2026-04-16 18:19:50.087505642 +0000 UTC m=+149.063621853" watchObservedRunningTime="2026-04-16 18:19:50.088233802 +0000 UTC m=+149.064350013" Apr 16 18:19:51.464899 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:51.464847 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:19:51.465298 
ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:51.465041 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:51.465298 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:51.465146 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:19:55.46512374 +0000 UTC m=+154.441239946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:51.666306 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:51.666274 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:19:51.666523 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:51.666395 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:51.666523 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:51.666445 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls podName:6eb60089-5b1f-4d4a-a582-f110affcf229 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:19:55.666431294 +0000 UTC m=+154.642547482 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls") pod "cluster-samples-operator-667775844f-fsln7" (UID: "6eb60089-5b1f-4d4a-a582-f110affcf229") : secret "samples-operator-tls" not found Apr 16 18:19:52.078521 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.078490 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/0.log" Apr 16 18:19:52.078717 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.078532 2582 generic.go:358] "Generic (PLEG): container finished" podID="16f4257e-9483-4d05-bec2-a89b52ff2015" containerID="c764c20e4a7a6ed1c92a1a320ef5e9285210134e74015e1179ac5683e714c478" exitCode=255 Apr 16 18:19:52.078717 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.078565 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" event={"ID":"16f4257e-9483-4d05-bec2-a89b52ff2015","Type":"ContainerDied","Data":"c764c20e4a7a6ed1c92a1a320ef5e9285210134e74015e1179ac5683e714c478"} Apr 16 18:19:52.078917 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.078884 2582 scope.go:117] "RemoveContainer" containerID="c764c20e4a7a6ed1c92a1a320ef5e9285210134e74015e1179ac5683e714c478" Apr 16 18:19:52.080112 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.080066 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h" event={"ID":"ead2ae15-459e-4b99-898d-ae36578d9ffa","Type":"ContainerStarted","Data":"59f9082b598e7209265a90f73df5036b6e7e0ff051b774022f971701432a3157"} Apr 16 18:19:52.081524 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.081414 2582 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs" event={"ID":"314b9f84-3289-4525-9f15-23803f5ec61a","Type":"ContainerStarted","Data":"ed5e0ff5e2d9d263f9ec4eeeeb60a6af665beb56cf0b7a7d44a01e7e90d0e08c"} Apr 16 18:19:52.116307 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.116257 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h" podStartSLOduration=1.910921277 podStartE2EDuration="5.116237935s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:48.238923522 +0000 UTC m=+147.215039712" lastFinishedPulling="2026-04-16 18:19:51.444240176 +0000 UTC m=+150.420356370" observedRunningTime="2026-04-16 18:19:52.115487773 +0000 UTC m=+151.091603986" watchObservedRunningTime="2026-04-16 18:19:52.116237935 +0000 UTC m=+151.092354147" Apr 16 18:19:52.136733 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.136678 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs" podStartSLOduration=1.914696731 podStartE2EDuration="5.136659552s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:48.22063721 +0000 UTC m=+147.196753399" lastFinishedPulling="2026-04-16 18:19:51.442600017 +0000 UTC m=+150.418716220" observedRunningTime="2026-04-16 18:19:52.135249122 +0000 UTC m=+151.111365334" watchObservedRunningTime="2026-04-16 18:19:52.136659552 +0000 UTC m=+151.112775765" Apr 16 18:19:52.786862 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.786774 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2"] Apr 16 18:19:52.789615 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.789593 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" Apr 16 18:19:52.792059 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.792032 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-f66w9\"" Apr 16 18:19:52.792187 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.792061 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 18:19:52.792187 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.792137 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 18:19:52.798700 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.798678 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2"] Apr 16 18:19:52.875184 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.875152 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgzz7\" (UniqueName: \"kubernetes.io/projected/01d4cf30-f9cf-4378-9264-d1541f508a34-kube-api-access-rgzz7\") pod \"migrator-64d4d94569-bf6n2\" (UID: \"01d4cf30-f9cf-4378-9264-d1541f508a34\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" Apr 16 18:19:52.975894 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:52.975846 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rgzz7\" (UniqueName: \"kubernetes.io/projected/01d4cf30-f9cf-4378-9264-d1541f508a34-kube-api-access-rgzz7\") pod \"migrator-64d4d94569-bf6n2\" (UID: \"01d4cf30-f9cf-4378-9264-d1541f508a34\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" Apr 16 18:19:52.987570 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:19:52.987540 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgzz7\" (UniqueName: \"kubernetes.io/projected/01d4cf30-f9cf-4378-9264-d1541f508a34-kube-api-access-rgzz7\") pod \"migrator-64d4d94569-bf6n2\" (UID: \"01d4cf30-f9cf-4378-9264-d1541f508a34\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" Apr 16 18:19:53.084908 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.084832 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/1.log" Apr 16 18:19:53.085213 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.085198 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/0.log" Apr 16 18:19:53.085274 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.085232 2582 generic.go:358] "Generic (PLEG): container finished" podID="16f4257e-9483-4d05-bec2-a89b52ff2015" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" exitCode=255 Apr 16 18:19:53.085319 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.085260 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" event={"ID":"16f4257e-9483-4d05-bec2-a89b52ff2015","Type":"ContainerDied","Data":"ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2"} Apr 16 18:19:53.085319 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.085313 2582 scope.go:117] "RemoveContainer" containerID="c764c20e4a7a6ed1c92a1a320ef5e9285210134e74015e1179ac5683e714c478" Apr 16 18:19:53.085623 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.085598 2582 scope.go:117] "RemoveContainer" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" Apr 16 18:19:53.085829 ip-10-0-139-117 
kubenswrapper[2582]: E0416 18:19:53.085806 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:19:53.098816 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.098798 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" Apr 16 18:19:53.212290 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.212261 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cj779_f68745e4-3c2b-4cbe-80a9-80320d887584/dns-node-resolver/0.log" Apr 16 18:19:53.221173 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:53.221141 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2"] Apr 16 18:19:53.225422 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:53.225393 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d4cf30_f9cf_4378_9264_d1541f508a34.slice/crio-872bde3fd51348daf4380895d51d1935f7ea0d619fa9298f1c155d9266df0a59 WatchSource:0}: Error finding container 872bde3fd51348daf4380895d51d1935f7ea0d619fa9298f1c155d9266df0a59: Status 404 returned error can't find the container with id 872bde3fd51348daf4380895d51d1935f7ea0d619fa9298f1c155d9266df0a59 Apr 16 18:19:54.089946 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.089922 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/1.log" Apr 16 18:19:54.090392 
ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.090351 2582 scope.go:117] "RemoveContainer" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" Apr 16 18:19:54.090547 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:54.090528 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:19:54.090997 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.090976 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" event={"ID":"01d4cf30-f9cf-4378-9264-d1541f508a34","Type":"ContainerStarted","Data":"872bde3fd51348daf4380895d51d1935f7ea0d619fa9298f1c155d9266df0a59"} Apr 16 18:19:54.213108 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.213065 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n229m_bead58a1-c7d1-4221-8dba-7355ad1eee28/node-ca/0.log" Apr 16 18:19:54.822558 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.822522 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2njtm"] Apr 16 18:19:54.825696 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.825674 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.828453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.828428 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 18:19:54.828453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.828450 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 18:19:54.828934 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.828917 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 18:19:54.829016 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.828918 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-rhvwd\"" Apr 16 18:19:54.829016 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.828921 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 18:19:54.839713 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.839689 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2njtm"] Apr 16 18:19:54.892398 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.892363 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-key\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.892565 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.892405 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2b79\" (UniqueName: 
\"kubernetes.io/projected/26904c3e-36cd-470d-b9a6-e9070759a97f-kube-api-access-n2b79\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.892565 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.892469 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-cabundle\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.993053 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.993011 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2b79\" (UniqueName: \"kubernetes.io/projected/26904c3e-36cd-470d-b9a6-e9070759a97f-kube-api-access-n2b79\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.993237 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.993077 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-cabundle\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.993237 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.993177 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-key\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.993920 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:19:54.993890 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-cabundle\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:54.995616 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:54.995596 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/26904c3e-36cd-470d-b9a6-e9070759a97f-signing-key\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:55.002996 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.002965 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2b79\" (UniqueName: \"kubernetes.io/projected/26904c3e-36cd-470d-b9a6-e9070759a97f-kube-api-access-n2b79\") pod \"service-ca-bfc587fb7-2njtm\" (UID: \"26904c3e-36cd-470d-b9a6-e9070759a97f\") " pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:55.095503 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.095418 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" event={"ID":"01d4cf30-f9cf-4378-9264-d1541f508a34","Type":"ContainerStarted","Data":"ed0b5f96ed0fa4092a2466b3cf80620c9fe526ef2bbb3dfc76828b8caedde6f8"} Apr 16 18:19:55.095503 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.095457 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" event={"ID":"01d4cf30-f9cf-4378-9264-d1541f508a34","Type":"ContainerStarted","Data":"77f3e1563d2169b907fdee5ff6ab55e37e199c783a5117fd5137596d85f70559"} Apr 16 18:19:55.112933 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.112888 2582 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-bf6n2" podStartSLOduration=1.9692664359999998 podStartE2EDuration="3.112872799s" podCreationTimestamp="2026-04-16 18:19:52 +0000 UTC" firstStartedPulling="2026-04-16 18:19:53.227754679 +0000 UTC m=+152.203870881" lastFinishedPulling="2026-04-16 18:19:54.371361052 +0000 UTC m=+153.347477244" observedRunningTime="2026-04-16 18:19:55.11244118 +0000 UTC m=+154.088557391" watchObservedRunningTime="2026-04-16 18:19:55.112872799 +0000 UTC m=+154.088989009" Apr 16 18:19:55.134591 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.134557 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" Apr 16 18:19:55.258155 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.258118 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-2njtm"] Apr 16 18:19:55.261109 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:19:55.261071 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26904c3e_36cd_470d_b9a6_e9070759a97f.slice/crio-4cc5950a81834081a24c517c6298c5c0ce0009be7ec23dfc46f6710cf4210d52 WatchSource:0}: Error finding container 4cc5950a81834081a24c517c6298c5c0ce0009be7ec23dfc46f6710cf4210d52: Status 404 returned error can't find the container with id 4cc5950a81834081a24c517c6298c5c0ce0009be7ec23dfc46f6710cf4210d52 Apr 16 18:19:55.496690 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.496639 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:19:55.496888 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:55.496787 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:55.496888 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:55.496854 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:20:03.49683844 +0000 UTC m=+162.472954630 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:19:55.698424 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:55.698380 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:19:55.698628 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:55.698530 2582 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:19:55.698628 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:55.698597 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls podName:6eb60089-5b1f-4d4a-a582-f110affcf229 
nodeName:}" failed. No retries permitted until 2026-04-16 18:20:03.698580921 +0000 UTC m=+162.674697110 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls") pod "cluster-samples-operator-667775844f-fsln7" (UID: "6eb60089-5b1f-4d4a-a582-f110affcf229") : secret "samples-operator-tls" not found Apr 16 18:19:56.099490 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:56.099454 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" event={"ID":"26904c3e-36cd-470d-b9a6-e9070759a97f","Type":"ContainerStarted","Data":"070117bc315eab17071d7326451d7efff615247cbcc0384b5aed9a150b18a0b0"} Apr 16 18:19:56.099490 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:56.099495 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" event={"ID":"26904c3e-36cd-470d-b9a6-e9070759a97f","Type":"ContainerStarted","Data":"4cc5950a81834081a24c517c6298c5c0ce0009be7ec23dfc46f6710cf4210d52"} Apr 16 18:19:56.116636 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:56.116587 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-2njtm" podStartSLOduration=2.11657257 podStartE2EDuration="2.11657257s" podCreationTimestamp="2026-04-16 18:19:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:19:56.115915256 +0000 UTC m=+155.092031466" watchObservedRunningTime="2026-04-16 18:19:56.11657257 +0000 UTC m=+155.092688778" Apr 16 18:19:56.955361 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:56.955312 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-js745" 
podUID="2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1" Apr 16 18:19:56.977893 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:56.977851 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-vf74w" podUID="0362b269-7b97-4579-a5a1-f882325a361a" Apr 16 18:19:57.102198 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:57.102167 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-js745" Apr 16 18:19:58.095660 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:58.095622 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:19:58.095660 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:58.095660 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:19:58.096041 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:19:58.096025 2582 scope.go:117] "RemoveContainer" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" Apr 16 18:19:58.096240 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:58.096224 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:19:58.628826 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:19:58.628781 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-multus/network-metrics-daemon-4zpgf" podUID="847e2695-c897-4ed9-95c4-10d0fbef9e09" Apr 16 18:20:01.850020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.849931 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:20:01.850020 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.849973 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:20:01.852335 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.852302 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1-metrics-tls\") pod \"dns-default-js745\" (UID: \"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1\") " pod="openshift-dns/dns-default-js745" Apr 16 18:20:01.852565 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.852542 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0362b269-7b97-4579-a5a1-f882325a361a-cert\") pod \"ingress-canary-vf74w\" (UID: \"0362b269-7b97-4579-a5a1-f882325a361a\") " pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:20:01.906118 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.906070 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\"" Apr 16 18:20:01.914218 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:01.914195 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-js745" Apr 16 18:20:02.036335 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:02.036302 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-js745"] Apr 16 18:20:02.039410 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:02.039383 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ea7a9c5_06e5_43bc_86f8_4ae8371bb1d1.slice/crio-f97f96d8347401a4eed5532b3251f73be4c5e211733419319abda2426e7b903d WatchSource:0}: Error finding container f97f96d8347401a4eed5532b3251f73be4c5e211733419319abda2426e7b903d: Status 404 returned error can't find the container with id f97f96d8347401a4eed5532b3251f73be4c5e211733419319abda2426e7b903d Apr 16 18:20:02.116372 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:02.116286 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-js745" event={"ID":"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1","Type":"ContainerStarted","Data":"f97f96d8347401a4eed5532b3251f73be4c5e211733419319abda2426e7b903d"} Apr 16 18:20:03.565214 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:03.565178 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:20:03.565583 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:03.565322 2582 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:03.565583 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:03.565388 2582 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls podName:fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b nodeName:}" failed. No retries permitted until 2026-04-16 18:20:19.565371799 +0000 UTC m=+178.541487993 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-9b99n" (UID: "fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b") : secret "cluster-monitoring-operator-tls" not found Apr 16 18:20:03.767049 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:03.767000 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:20:03.776540 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:03.776512 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6eb60089-5b1f-4d4a-a582-f110affcf229-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-fsln7\" (UID: \"6eb60089-5b1f-4d4a-a582-f110affcf229\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:20:03.783024 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:03.783001 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" Apr 16 18:20:03.909297 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:03.909268 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7"] Apr 16 18:20:04.127407 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:04.127313 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" event={"ID":"6eb60089-5b1f-4d4a-a582-f110affcf229","Type":"ContainerStarted","Data":"9fb795bb7f95b519d4d9bfbcaddb5fbf1f12d2614e46835c3b1824f295c5eb83"} Apr 16 18:20:04.128936 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:04.128915 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-js745" event={"ID":"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1","Type":"ContainerStarted","Data":"fbadc7efc7756598aad664b464205f8f184a0c6e4ce7c5710745f112f1c882e2"} Apr 16 18:20:04.129034 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:04.128944 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-js745" event={"ID":"2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1","Type":"ContainerStarted","Data":"1de1d6a5249f7d0c64f751a375596adf3bdb0fb88aed351bf9991cf633e138b4"} Apr 16 18:20:04.129089 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:04.129036 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-js745" Apr 16 18:20:04.149760 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:04.149709 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-js745" podStartSLOduration=129.825044164 podStartE2EDuration="2m11.149694886s" podCreationTimestamp="2026-04-16 18:17:53 +0000 UTC" firstStartedPulling="2026-04-16 18:20:02.041315171 +0000 UTC m=+161.017431360" lastFinishedPulling="2026-04-16 18:20:03.36596588 
+0000 UTC m=+162.342082082" observedRunningTime="2026-04-16 18:20:04.148934698 +0000 UTC m=+163.125050915" watchObservedRunningTime="2026-04-16 18:20:04.149694886 +0000 UTC m=+163.125811112" Apr 16 18:20:06.137401 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:06.137364 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" event={"ID":"6eb60089-5b1f-4d4a-a582-f110affcf229","Type":"ContainerStarted","Data":"914701884d699354996a9e04888f6c7c1d35492c71f84aebe53c1119b380449f"} Apr 16 18:20:06.137401 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:06.137405 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" event={"ID":"6eb60089-5b1f-4d4a-a582-f110affcf229","Type":"ContainerStarted","Data":"35aff842137a3809f8daa2ce5fc3c363b8da018054bf39efac3b8aa36fa9c24c"} Apr 16 18:20:06.160207 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:06.160138 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-fsln7" podStartSLOduration=17.528678454 podStartE2EDuration="19.160119515s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:20:03.959756961 +0000 UTC m=+162.935873149" lastFinishedPulling="2026-04-16 18:20:05.591198016 +0000 UTC m=+164.567314210" observedRunningTime="2026-04-16 18:20:06.159583407 +0000 UTC m=+165.135699618" watchObservedRunningTime="2026-04-16 18:20:06.160119515 +0000 UTC m=+165.136235738" Apr 16 18:20:07.618238 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:07.618207 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:20:07.621376 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:07.621357 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\"" Apr 16 18:20:07.629405 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:07.629385 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vf74w" Apr 16 18:20:07.763722 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:07.763691 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vf74w"] Apr 16 18:20:07.767971 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:07.767936 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0362b269_7b97_4579_a5a1_f882325a361a.slice/crio-d525af37013199df3de9ed766f5546f7a5cee5a42ff89de7c8bd9f83381e5f78 WatchSource:0}: Error finding container d525af37013199df3de9ed766f5546f7a5cee5a42ff89de7c8bd9f83381e5f78: Status 404 returned error can't find the container with id d525af37013199df3de9ed766f5546f7a5cee5a42ff89de7c8bd9f83381e5f78 Apr 16 18:20:08.143662 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:08.143627 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vf74w" event={"ID":"0362b269-7b97-4579-a5a1-f882325a361a","Type":"ContainerStarted","Data":"d525af37013199df3de9ed766f5546f7a5cee5a42ff89de7c8bd9f83381e5f78"} Apr 16 18:20:09.614267 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:09.614172 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf" Apr 16 18:20:10.150747 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:10.150711 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vf74w" event={"ID":"0362b269-7b97-4579-a5a1-f882325a361a","Type":"ContainerStarted","Data":"2d0c8da00664584ff5ca57ecd211859d09316063c89a1a173f9a9648173b3163"} Apr 16 18:20:10.168781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:10.168730 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vf74w" podStartSLOduration=135.626647327 podStartE2EDuration="2m17.168714901s" podCreationTimestamp="2026-04-16 18:17:53 +0000 UTC" firstStartedPulling="2026-04-16 18:20:07.76985781 +0000 UTC m=+166.745974002" lastFinishedPulling="2026-04-16 18:20:09.31192538 +0000 UTC m=+168.288041576" observedRunningTime="2026-04-16 18:20:10.167757985 +0000 UTC m=+169.143874196" watchObservedRunningTime="2026-04-16 18:20:10.168714901 +0000 UTC m=+169.144831111" Apr 16 18:20:12.614869 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:12.614830 2582 scope.go:117] "RemoveContainer" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" Apr 16 18:20:13.160998 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.160971 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log" Apr 16 18:20:13.161351 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.161336 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/1.log" Apr 16 18:20:13.161412 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.161368 2582 generic.go:358] "Generic (PLEG): container finished" podID="16f4257e-9483-4d05-bec2-a89b52ff2015" 
containerID="7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887" exitCode=255 Apr 16 18:20:13.161412 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.161395 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" event={"ID":"16f4257e-9483-4d05-bec2-a89b52ff2015","Type":"ContainerDied","Data":"7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887"} Apr 16 18:20:13.161475 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.161421 2582 scope.go:117] "RemoveContainer" containerID="ac48bf4c80829fe9c11ae6587bd4c333aa5bf14eb9228e5c56a5610f009a7ac2" Apr 16 18:20:13.161757 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:13.161738 2582 scope.go:117] "RemoveContainer" containerID="7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887" Apr 16 18:20:13.161940 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:13.161922 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:20:14.134955 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:14.134925 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-js745" Apr 16 18:20:14.165389 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:14.165361 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log" Apr 16 18:20:15.300602 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.300566 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qx2kf"] Apr 16 
18:20:15.335745 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.335716 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qx2kf"] Apr 16 18:20:15.335912 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.335843 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.344155 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.344127 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:20:15.344285 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.344156 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:15.344880 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.344856 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hm4gx\"" Apr 16 18:20:15.344949 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.344920 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:15.345160 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.345144 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:20:15.459198 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.459109 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/759c41ce-9647-447d-83db-cedf1e89428e-data-volume\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.459198 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:20:15.459162 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frm2c\" (UniqueName: \"kubernetes.io/projected/759c41ce-9647-447d-83db-cedf1e89428e-kube-api-access-frm2c\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.459198 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.459193 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/759c41ce-9647-447d-83db-cedf1e89428e-crio-socket\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.459409 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.459232 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/759c41ce-9647-447d-83db-cedf1e89428e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.459409 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.459262 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/759c41ce-9647-447d-83db-cedf1e89428e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.559827 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.559795 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/759c41ce-9647-447d-83db-cedf1e89428e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560000 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.559842 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/759c41ce-9647-447d-83db-cedf1e89428e-data-volume\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560000 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.559872 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frm2c\" (UniqueName: \"kubernetes.io/projected/759c41ce-9647-447d-83db-cedf1e89428e-kube-api-access-frm2c\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560000 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.559900 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/759c41ce-9647-447d-83db-cedf1e89428e-crio-socket\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560000 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.559933 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/759c41ce-9647-447d-83db-cedf1e89428e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560213 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:20:15.560046 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/759c41ce-9647-447d-83db-cedf1e89428e-crio-socket\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560287 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.560270 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/759c41ce-9647-447d-83db-cedf1e89428e-data-volume\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.560511 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.560496 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/759c41ce-9647-447d-83db-cedf1e89428e-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.562176 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.562157 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/759c41ce-9647-447d-83db-cedf1e89428e-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qx2kf\" (UID: \"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.569472 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.569448 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frm2c\" (UniqueName: \"kubernetes.io/projected/759c41ce-9647-447d-83db-cedf1e89428e-kube-api-access-frm2c\") pod \"insights-runtime-extractor-qx2kf\" (UID: 
\"759c41ce-9647-447d-83db-cedf1e89428e\") " pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.645165 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.645132 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qx2kf" Apr 16 18:20:15.768824 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:15.767321 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qx2kf"] Apr 16 18:20:15.772786 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:15.772750 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod759c41ce_9647_447d_83db_cedf1e89428e.slice/crio-56aa5fe24093912405d5fa1f6f8115630850c20793cf54fd1131cef1cbd36ed7 WatchSource:0}: Error finding container 56aa5fe24093912405d5fa1f6f8115630850c20793cf54fd1131cef1cbd36ed7: Status 404 returned error can't find the container with id 56aa5fe24093912405d5fa1f6f8115630850c20793cf54fd1131cef1cbd36ed7 Apr 16 18:20:16.171902 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:16.171866 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qx2kf" event={"ID":"759c41ce-9647-447d-83db-cedf1e89428e","Type":"ContainerStarted","Data":"ad85bdb8f8c23cd2e2e5d1b10d5f0ba9a4b906800fbcb43b923287e8e78ee775"} Apr 16 18:20:16.171902 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:16.171904 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qx2kf" event={"ID":"759c41ce-9647-447d-83db-cedf1e89428e","Type":"ContainerStarted","Data":"56aa5fe24093912405d5fa1f6f8115630850c20793cf54fd1131cef1cbd36ed7"} Apr 16 18:20:17.176493 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:17.176459 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qx2kf" 
event={"ID":"759c41ce-9647-447d-83db-cedf1e89428e","Type":"ContainerStarted","Data":"ad822df07e30515079333c519cb3a6f277a7e5d1b79ad5bd0e7103ab86148921"} Apr 16 18:20:18.095602 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:18.095565 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:20:18.095602 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:18.095610 2582 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:20:18.096045 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:18.096021 2582 scope.go:117] "RemoveContainer" containerID="7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887" Apr 16 18:20:18.096250 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:18.096222 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:20:19.183340 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.183305 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qx2kf" event={"ID":"759c41ce-9647-447d-83db-cedf1e89428e","Type":"ContainerStarted","Data":"2ba3b807a95f59c57f1aa7a9a73684bd4b75b7dfe462689b665e509e4fe92564"} Apr 16 18:20:19.593251 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.593149 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod 
\"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:20:19.595504 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.595482 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-9b99n\" (UID: \"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:20:19.760814 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.760766 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" Apr 16 18:20:19.877700 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.877604 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qx2kf" podStartSLOduration=2.453807273 podStartE2EDuration="4.877585597s" podCreationTimestamp="2026-04-16 18:20:15 +0000 UTC" firstStartedPulling="2026-04-16 18:20:15.898563351 +0000 UTC m=+174.874679546" lastFinishedPulling="2026-04-16 18:20:18.322341668 +0000 UTC m=+177.298457870" observedRunningTime="2026-04-16 18:20:19.210203674 +0000 UTC m=+178.186319884" watchObservedRunningTime="2026-04-16 18:20:19.877585597 +0000 UTC m=+178.853701828" Apr 16 18:20:19.877842 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:19.877756 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n"] Apr 16 18:20:19.881017 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:19.880973 2582 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb59922a_7f8a_4dfe_aeec_b44fcb1ba20b.slice/crio-d00f6000b533f1a2da4d4bf594d0512c77771cdfad6c23b1a45c4e60b102dd1a WatchSource:0}: Error finding container d00f6000b533f1a2da4d4bf594d0512c77771cdfad6c23b1a45c4e60b102dd1a: Status 404 returned error can't find the container with id d00f6000b533f1a2da4d4bf594d0512c77771cdfad6c23b1a45c4e60b102dd1a Apr 16 18:20:20.187546 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:20.187509 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" event={"ID":"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b","Type":"ContainerStarted","Data":"d00f6000b533f1a2da4d4bf594d0512c77771cdfad6c23b1a45c4e60b102dd1a"} Apr 16 18:20:22.193919 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:22.193877 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" event={"ID":"fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b","Type":"ContainerStarted","Data":"b8358cf14aa9ab2d63e57e5cbbedaa05250eb4eb43c731c7f564159b023eaeac"} Apr 16 18:20:22.212044 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:22.211994 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-9b99n" podStartSLOduration=33.702773153 podStartE2EDuration="35.211977374s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:20:19.884310498 +0000 UTC m=+178.860426687" lastFinishedPulling="2026-04-16 18:20:21.393514711 +0000 UTC m=+180.369630908" observedRunningTime="2026-04-16 18:20:22.211260195 +0000 UTC m=+181.187376408" watchObservedRunningTime="2026-04-16 18:20:22.211977374 +0000 UTC m=+181.188093636" Apr 16 18:20:29.615026 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:29.614988 2582 scope.go:117] "RemoveContainer" 
containerID="7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887" Apr 16 18:20:29.615522 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:29.615277 2582 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-kmx4l_openshift-console-operator(16f4257e-9483-4d05-bec2-a89b52ff2015)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podUID="16f4257e-9483-4d05-bec2-a89b52ff2015" Apr 16 18:20:30.384518 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.384485 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-lp8sq"] Apr 16 18:20:30.387538 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.387521 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.390434 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.390408 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:20:30.390791 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.390771 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:20:30.390872 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.390772 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:20:30.390930 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.390779 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:20:30.391529 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.391510 2582 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k6cqr\"" Apr 16 18:20:30.475438 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475395 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-sys\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475603 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475463 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-textfile\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475603 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475521 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475603 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475560 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-metrics-client-ca\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475770 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475662 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"root\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-root\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475770 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475708 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-tls\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475770 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475745 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475965 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475773 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ht64j\" (UniqueName: \"kubernetes.io/projected/84b72a6f-e66f-473d-98f9-2c24e5660d4d-kube-api-access-ht64j\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.475965 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.475812 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-wtmp\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576249 
ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576211 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-textfile\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576269 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576300 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-metrics-client-ca\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576336 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-root\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576357 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-tls\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " 
pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576390 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576432 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576415 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ht64j\" (UniqueName: \"kubernetes.io/projected/84b72a6f-e66f-473d-98f9-2c24e5660d4d-kube-api-access-ht64j\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576748 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576439 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-root\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576748 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576661 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-textfile\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576951 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576914 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-wtmp\") pod \"node-exporter-lp8sq\" 
(UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.576951 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.576975 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-sys\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.577190 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.577017 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-metrics-client-ca\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.577190 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.577076 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-sys\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.577190 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.577133 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-accelerators-collector-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.577190 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.577177 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-wtmp\") pod 
\"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.579016 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.578992 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.579251 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.579231 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/84b72a6f-e66f-473d-98f9-2c24e5660d4d-node-exporter-tls\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.587394 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.587367 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ht64j\" (UniqueName: \"kubernetes.io/projected/84b72a6f-e66f-473d-98f9-2c24e5660d4d-kube-api-access-ht64j\") pod \"node-exporter-lp8sq\" (UID: \"84b72a6f-e66f-473d-98f9-2c24e5660d4d\") " pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.697814 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:30.697778 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-lp8sq" Apr 16 18:20:30.706478 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:30.706448 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84b72a6f_e66f_473d_98f9_2c24e5660d4d.slice/crio-6bc1fdafbb0e0cb5023cc0af1d13b744125ba98785e997c0f03949a42c5ea798 WatchSource:0}: Error finding container 6bc1fdafbb0e0cb5023cc0af1d13b744125ba98785e997c0f03949a42c5ea798: Status 404 returned error can't find the container with id 6bc1fdafbb0e0cb5023cc0af1d13b744125ba98785e997c0f03949a42c5ea798 Apr 16 18:20:31.216917 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:31.216879 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lp8sq" event={"ID":"84b72a6f-e66f-473d-98f9-2c24e5660d4d","Type":"ContainerStarted","Data":"6bc1fdafbb0e0cb5023cc0af1d13b744125ba98785e997c0f03949a42c5ea798"} Apr 16 18:20:32.221056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:32.221019 2582 generic.go:358] "Generic (PLEG): container finished" podID="84b72a6f-e66f-473d-98f9-2c24e5660d4d" containerID="57688bc172da6d586aac2bc34369ca5f6691feeeb15f2a8b5f4d056e9c96b9c2" exitCode=0 Apr 16 18:20:32.221588 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:32.221067 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lp8sq" event={"ID":"84b72a6f-e66f-473d-98f9-2c24e5660d4d","Type":"ContainerDied","Data":"57688bc172da6d586aac2bc34369ca5f6691feeeb15f2a8b5f4d056e9c96b9c2"} Apr 16 18:20:33.226453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:33.226415 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lp8sq" event={"ID":"84b72a6f-e66f-473d-98f9-2c24e5660d4d","Type":"ContainerStarted","Data":"58a2fb1cd5d2a2762031d1da4b203c35c6ecd0f19635377cae216afab8622365"} Apr 16 18:20:33.226453 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:33.226456 2582 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-lp8sq" event={"ID":"84b72a6f-e66f-473d-98f9-2c24e5660d4d","Type":"ContainerStarted","Data":"57cc1d61f04014817be6b1802233a04343f0f3d34d86a59b7be1a01c2991bff4"} Apr 16 18:20:33.248537 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:33.248488 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-lp8sq" podStartSLOduration=2.3517066030000002 podStartE2EDuration="3.24847297s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:30.708129223 +0000 UTC m=+189.684245427" lastFinishedPulling="2026-04-16 18:20:31.604895592 +0000 UTC m=+190.581011794" observedRunningTime="2026-04-16 18:20:33.24736692 +0000 UTC m=+192.223483156" watchObservedRunningTime="2026-04-16 18:20:33.24847297 +0000 UTC m=+192.224589179" Apr 16 18:20:34.940655 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.940612 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-555fc6689c-q47b2"] Apr 16 18:20:34.943814 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.943797 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:34.946863 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.946832 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 18:20:34.948215 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.948182 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 18:20:34.948215 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.948203 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-pn2xn\"" Apr 16 18:20:34.948405 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.948271 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-kkdavsfrbusr\"" Apr 16 18:20:34.948405 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.948291 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 18:20:34.948509 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.948418 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:20:34.954710 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:34.954690 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-555fc6689c-q47b2"] Apr 16 18:20:35.015473 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015436 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-metrics-server-audit-profiles\") pod \"metrics-server-555fc6689c-q47b2\" (UID: 
\"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015623 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015494 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-client-certs\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015623 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015512 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-tls\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015623 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015541 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/bfc557f8-02ab-46e3-8b68-0db64836c12b-audit-log\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015742 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015618 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-client-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015742 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:20:35.015651 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.015742 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.015671 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkprd\" (UniqueName: \"kubernetes.io/projected/bfc557f8-02ab-46e3-8b68-0db64836c12b-kube-api-access-kkprd\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116320 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116290 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-client-certs\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116320 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116322 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-tls\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116556 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116348 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: 
\"kubernetes.io/empty-dir/bfc557f8-02ab-46e3-8b68-0db64836c12b-audit-log\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116556 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116365 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-client-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116556 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116387 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116556 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116406 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkprd\" (UniqueName: \"kubernetes.io/projected/bfc557f8-02ab-46e3-8b68-0db64836c12b-kube-api-access-kkprd\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116556 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116472 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-metrics-server-audit-profiles\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " 
pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.116896 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.116863 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/bfc557f8-02ab-46e3-8b68-0db64836c12b-audit-log\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.117780 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.117753 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.118024 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.118005 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/bfc557f8-02ab-46e3-8b68-0db64836c12b-metrics-server-audit-profiles\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.119333 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.119312 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-tls\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.119421 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.119359 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-client-ca-bundle\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.119685 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.119668 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/bfc557f8-02ab-46e3-8b68-0db64836c12b-secret-metrics-server-client-certs\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.132712 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.132690 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkprd\" (UniqueName: \"kubernetes.io/projected/bfc557f8-02ab-46e3-8b68-0db64836c12b-kube-api-access-kkprd\") pod \"metrics-server-555fc6689c-q47b2\" (UID: \"bfc557f8-02ab-46e3-8b68-0db64836c12b\") " pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.137543 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.137519 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x"] Apr 16 18:20:35.141634 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.141616 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:35.144891 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.144866 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 18:20:35.144998 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.144873 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-l4dz9\"" Apr 16 18:20:35.150161 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.150140 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x"] Apr 16 18:20:35.217406 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.217325 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-t857x\" (UID: \"c23ee750-8c36-4728-bd70-7a3ad044554d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:35.253490 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.253436 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:35.318578 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.318001 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-t857x\" (UID: \"c23ee750-8c36-4728-bd70-7a3ad044554d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:35.318578 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:35.318193 2582 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 16 18:20:35.318578 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:20:35.318259 2582 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert podName:c23ee750-8c36-4728-bd70-7a3ad044554d nodeName:}" failed. No retries permitted until 2026-04-16 18:20:35.818239172 +0000 UTC m=+194.794355367 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert") pod "monitoring-plugin-5876b4bbc7-t857x" (UID: "c23ee750-8c36-4728-bd70-7a3ad044554d") : secret "monitoring-plugin-cert" not found Apr 16 18:20:35.382677 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.382643 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-555fc6689c-q47b2"] Apr 16 18:20:35.386376 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:35.386332 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfc557f8_02ab_46e3_8b68_0db64836c12b.slice/crio-d077d83b158036b748127e0c74b888172e3336e4f786dc94343d23ce4158a842 WatchSource:0}: Error finding container d077d83b158036b748127e0c74b888172e3336e4f786dc94343d23ce4158a842: Status 404 returned error can't find the container with id d077d83b158036b748127e0c74b888172e3336e4f786dc94343d23ce4158a842 Apr 16 18:20:35.822161 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.822118 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-t857x\" (UID: \"c23ee750-8c36-4728-bd70-7a3ad044554d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:35.824430 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:35.824409 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c23ee750-8c36-4728-bd70-7a3ad044554d-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-t857x\" (UID: \"c23ee750-8c36-4728-bd70-7a3ad044554d\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:36.051954 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:20:36.051920 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:36.204231 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:36.204199 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x"] Apr 16 18:20:36.208176 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:20:36.208141 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23ee750_8c36_4728_bd70_7a3ad044554d.slice/crio-cd3c0eddc87277e6289cbd2a18d8693e0aff57803cf89cbd8f2cf04ed26bd678 WatchSource:0}: Error finding container cd3c0eddc87277e6289cbd2a18d8693e0aff57803cf89cbd8f2cf04ed26bd678: Status 404 returned error can't find the container with id cd3c0eddc87277e6289cbd2a18d8693e0aff57803cf89cbd8f2cf04ed26bd678 Apr 16 18:20:36.236385 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:36.236330 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" event={"ID":"c23ee750-8c36-4728-bd70-7a3ad044554d","Type":"ContainerStarted","Data":"cd3c0eddc87277e6289cbd2a18d8693e0aff57803cf89cbd8f2cf04ed26bd678"} Apr 16 18:20:36.237690 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:36.237656 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" event={"ID":"bfc557f8-02ab-46e3-8b68-0db64836c12b","Type":"ContainerStarted","Data":"d077d83b158036b748127e0c74b888172e3336e4f786dc94343d23ce4158a842"} Apr 16 18:20:37.241750 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:37.241714 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" event={"ID":"bfc557f8-02ab-46e3-8b68-0db64836c12b","Type":"ContainerStarted","Data":"e272a72dd62197e2c4a223858e110374bd46c640768966a4bb52b8809dac4b68"} Apr 16 18:20:37.261004 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:20:37.260951 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" podStartSLOduration=1.7987968429999999 podStartE2EDuration="3.260936764s" podCreationTimestamp="2026-04-16 18:20:34 +0000 UTC" firstStartedPulling="2026-04-16 18:20:35.38843063 +0000 UTC m=+194.364546836" lastFinishedPulling="2026-04-16 18:20:36.850570552 +0000 UTC m=+195.826686757" observedRunningTime="2026-04-16 18:20:37.260211891 +0000 UTC m=+196.236328102" watchObservedRunningTime="2026-04-16 18:20:37.260936764 +0000 UTC m=+196.237052974" Apr 16 18:20:38.246108 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:38.246064 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" event={"ID":"c23ee750-8c36-4728-bd70-7a3ad044554d","Type":"ContainerStarted","Data":"8e7fbcbc07f86ad6965703d72b6bdd6053ba7c0e12fe6a434060eb1de4e86f31"} Apr 16 18:20:38.246552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:38.246310 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:38.251048 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:38.251020 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" Apr 16 18:20:38.264912 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:38.264842 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-t857x" podStartSLOduration=1.739190674 podStartE2EDuration="3.264826951s" podCreationTimestamp="2026-04-16 18:20:35 +0000 UTC" firstStartedPulling="2026-04-16 18:20:36.210790369 +0000 UTC m=+195.186906561" lastFinishedPulling="2026-04-16 18:20:37.736426636 +0000 UTC m=+196.712542838" observedRunningTime="2026-04-16 18:20:38.263845022 +0000 UTC m=+197.239961234" 
watchObservedRunningTime="2026-04-16 18:20:38.264826951 +0000 UTC m=+197.240943161" Apr 16 18:20:43.615119 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:43.615060 2582 scope.go:117] "RemoveContainer" containerID="7831a836032f476d71c00c5103600fb106aaa864d59698decc72d789bf9b3887" Apr 16 18:20:44.268230 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:44.268198 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log" Apr 16 18:20:44.268397 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:44.268258 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" event={"ID":"16f4257e-9483-4d05-bec2-a89b52ff2015","Type":"ContainerStarted","Data":"5c9ae448e30d388c1ab8d8a93daf46ca694de52bfa8c62376935b98b5a4fd401"} Apr 16 18:20:44.268553 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:44.268535 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:20:44.279862 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:44.279812 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" Apr 16 18:20:44.292064 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:44.292009 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-kmx4l" podStartSLOduration=54.10738153 podStartE2EDuration="57.291995051s" podCreationTimestamp="2026-04-16 18:19:47 +0000 UTC" firstStartedPulling="2026-04-16 18:19:48.254273457 +0000 UTC m=+147.230389659" lastFinishedPulling="2026-04-16 18:19:51.438886991 +0000 UTC m=+150.415003180" observedRunningTime="2026-04-16 18:20:44.291228043 +0000 UTC m=+203.267344279" watchObservedRunningTime="2026-04-16 18:20:44.291995051 +0000 UTC 
m=+203.268111261" Apr 16 18:20:55.254428 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:55.254391 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:20:55.254428 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:20:55.254433 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2" Apr 16 18:21:04.379971 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.379940 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"] Apr 16 18:21:04.382104 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.382075 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.384597 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.384575 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:21:04.385709 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.385688 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:21:04.386065 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.386030 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8jw5b\"" Apr 16 18:21:04.386191 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.386037 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 18:21:04.386824 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.386807 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 18:21:04.386999 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.386978 2582 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 18:21:04.387125 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.387037 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 18:21:04.387305 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.387289 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 18:21:04.391486 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.391466 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 18:21:04.392118 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.392084 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"] Apr 16 18:21:04.448465 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448428 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdpm\" (UniqueName: \"kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448663 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448484 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448663 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448506 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448663 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448550 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448831 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448659 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448831 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448708 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.448831 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.448742 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549570 ip-10-0-139-117 kubenswrapper[2582]: 
I0416 18:21:04.549524 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549570 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549575 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549826 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549826 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549696 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549826 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549723 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 
16 18:21:04.549826 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549739 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.549826 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.549796 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdpm\" (UniqueName: \"kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.550546 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.550522 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.550662 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.550595 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.550731 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.550684 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " 
pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.550731 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.550699 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.552249 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.552227 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.552320 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.552279 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.559879 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.559855 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdpm\" (UniqueName: \"kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm\") pod \"console-5489cdfb5c-chtgh\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") " pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.691975 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.691937 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5489cdfb5c-chtgh" Apr 16 18:21:04.813317 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:04.813282 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"] Apr 16 18:21:04.815908 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:21:04.815884 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad47d22c_88db_4f84_8239_fd6eb01e8a33.slice/crio-a24b31163bad10db0288a75e4eea2aec69a9687d6ece098f0c87011fc69bbbfb WatchSource:0}: Error finding container a24b31163bad10db0288a75e4eea2aec69a9687d6ece098f0c87011fc69bbbfb: Status 404 returned error can't find the container with id a24b31163bad10db0288a75e4eea2aec69a9687d6ece098f0c87011fc69bbbfb Apr 16 18:21:05.328456 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:05.328416 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5489cdfb5c-chtgh" event={"ID":"ad47d22c-88db-4f84-8239-fd6eb01e8a33","Type":"ContainerStarted","Data":"a24b31163bad10db0288a75e4eea2aec69a9687d6ece098f0c87011fc69bbbfb"} Apr 16 18:21:08.337820 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:08.337789 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5489cdfb5c-chtgh" event={"ID":"ad47d22c-88db-4f84-8239-fd6eb01e8a33","Type":"ContainerStarted","Data":"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"} Apr 16 18:21:08.358709 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:08.358653 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5489cdfb5c-chtgh" podStartSLOduration=1.7485561490000001 podStartE2EDuration="4.358637216s" podCreationTimestamp="2026-04-16 18:21:04 +0000 UTC" firstStartedPulling="2026-04-16 18:21:04.817722605 +0000 UTC m=+223.793838793" lastFinishedPulling="2026-04-16 18:21:07.427803657 +0000 UTC m=+226.403919860" 
observedRunningTime="2026-04-16 18:21:08.357456258 +0000 UTC m=+227.333572480" watchObservedRunningTime="2026-04-16 18:21:08.358637216 +0000 UTC m=+227.334753427"
Apr 16 18:21:12.349499 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.349463 2582 generic.go:358] "Generic (PLEG): container finished" podID="ead2ae15-459e-4b99-898d-ae36578d9ffa" containerID="59f9082b598e7209265a90f73df5036b6e7e0ff051b774022f971701432a3157" exitCode=0
Apr 16 18:21:12.349937 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.349536 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h" event={"ID":"ead2ae15-459e-4b99-898d-ae36578d9ffa","Type":"ContainerDied","Data":"59f9082b598e7209265a90f73df5036b6e7e0ff051b774022f971701432a3157"}
Apr 16 18:21:12.349937 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.349879 2582 scope.go:117] "RemoveContainer" containerID="59f9082b598e7209265a90f73df5036b6e7e0ff051b774022f971701432a3157"
Apr 16 18:21:12.350890 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.350865 2582 generic.go:358] "Generic (PLEG): container finished" podID="314b9f84-3289-4525-9f15-23803f5ec61a" containerID="ed5e0ff5e2d9d263f9ec4eeeeb60a6af665beb56cf0b7a7d44a01e7e90d0e08c" exitCode=0
Apr 16 18:21:12.350951 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.350905 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs" event={"ID":"314b9f84-3289-4525-9f15-23803f5ec61a","Type":"ContainerDied","Data":"ed5e0ff5e2d9d263f9ec4eeeeb60a6af665beb56cf0b7a7d44a01e7e90d0e08c"}
Apr 16 18:21:12.351177 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:12.351163 2582 scope.go:117] "RemoveContainer" containerID="ed5e0ff5e2d9d263f9ec4eeeeb60a6af665beb56cf0b7a7d44a01e7e90d0e08c"
Apr 16 18:21:13.354781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:13.354743 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-m887h" event={"ID":"ead2ae15-459e-4b99-898d-ae36578d9ffa","Type":"ContainerStarted","Data":"c5abe9bf0d7983b73ccc38004678a633905101eb4411906b88458350e373e3fa"}
Apr 16 18:21:13.356409 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:13.356386 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69965bb79d-kxdgs" event={"ID":"314b9f84-3289-4525-9f15-23803f5ec61a","Type":"ContainerStarted","Data":"5f27b8ea08f410ed6d32037bb88d3f3985c2acfbe1b34406d38c6a5090f7595b"}
Apr 16 18:21:14.692334 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:14.692282 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:21:14.692334 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:14.692342 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:21:14.697346 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:14.697321 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:21:15.259869 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:15.259841 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2"
Apr 16 18:21:15.263965 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:15.263941 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-555fc6689c-q47b2"
Apr 16 18:21:15.366705 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:15.366679 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:21:16.856741 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:16.856713 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-js745_2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1/dns/0.log"
Apr 16 18:21:16.863764 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:16.863737 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-js745_2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1/kube-rbac-proxy/0.log"
Apr 16 18:21:17.237840 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:17.237815 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cj779_f68745e4-3c2b-4cbe-80a9-80320d887584/dns-node-resolver/0.log"
Apr 16 18:21:32.486780 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:32.486690 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:21:32.489013 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:32.488992 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/847e2695-c897-4ed9-95c4-10d0fbef9e09-metrics-certs\") pod \"network-metrics-daemon-4zpgf\" (UID: \"847e2695-c897-4ed9-95c4-10d0fbef9e09\") " pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:21:32.717610 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:32.717579 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\""
Apr 16 18:21:32.725720 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:32.725700 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4zpgf"
Apr 16 18:21:32.846516 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:32.846491 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4zpgf"]
Apr 16 18:21:32.849206 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:21:32.849174 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod847e2695_c897_4ed9_95c4_10d0fbef9e09.slice/crio-364ff5795263d75f5b646cbc54c0c52c513f282d5fb61876c64e8d547bb13e8a WatchSource:0}: Error finding container 364ff5795263d75f5b646cbc54c0c52c513f282d5fb61876c64e8d547bb13e8a: Status 404 returned error can't find the container with id 364ff5795263d75f5b646cbc54c0c52c513f282d5fb61876c64e8d547bb13e8a
Apr 16 18:21:33.416001 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:33.415961 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zpgf" event={"ID":"847e2695-c897-4ed9-95c4-10d0fbef9e09","Type":"ContainerStarted","Data":"364ff5795263d75f5b646cbc54c0c52c513f282d5fb61876c64e8d547bb13e8a"}
Apr 16 18:21:34.424850 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:34.424804 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zpgf" event={"ID":"847e2695-c897-4ed9-95c4-10d0fbef9e09","Type":"ContainerStarted","Data":"727a756c37e4abf9fa9b00f400d2d1a157e6402e18e7d5248a25940a84c06741"}
Apr 16 18:21:34.424850 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:34.424852 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4zpgf" event={"ID":"847e2695-c897-4ed9-95c4-10d0fbef9e09","Type":"ContainerStarted","Data":"3ba69b3080b582da0910f148cd9c8b3dbce10bebf6625f0deb87464c0f245ed4"}
Apr 16 18:21:34.445285 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:21:34.445232 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4zpgf" podStartSLOduration=252.567373203 podStartE2EDuration="4m13.445216426s" podCreationTimestamp="2026-04-16 18:17:21 +0000 UTC" firstStartedPulling="2026-04-16 18:21:32.851046133 +0000 UTC m=+251.827162322" lastFinishedPulling="2026-04-16 18:21:33.728889339 +0000 UTC m=+252.705005545" observedRunningTime="2026-04-16 18:21:34.444945749 +0000 UTC m=+253.421061961" watchObservedRunningTime="2026-04-16 18:21:34.445216426 +0000 UTC m=+253.421332631"
Apr 16 18:22:01.509334 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:01.509301 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"]
Apr 16 18:22:21.500255 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:21.500230 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:22:21.501288 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:21.501265 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:22:21.505537 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:21.505507 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:22:21.506450 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:21.506427 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:22:21.510778 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:21.510760 2582 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 18:22:26.528750 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.528677 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5489cdfb5c-chtgh" podUID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" containerName="console" containerID="cri-o://a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269" gracePeriod=15
Apr 16 18:22:26.771905 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.771875 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5489cdfb5c-chtgh_ad47d22c-88db-4f84-8239-fd6eb01e8a33/console/0.log"
Apr 16 18:22:26.772041 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.771940 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:22:26.815620 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815531 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzdpm\" (UniqueName: \"kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815620 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815582 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815620 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815605 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815890 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815651 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815890 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815672 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815890 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815713 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.815890 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.815739 2582 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert\") pod \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\" (UID: \"ad47d22c-88db-4f84-8239-fd6eb01e8a33\") "
Apr 16 18:22:26.816220 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.816183 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca" (OuterVolumeSpecName: "service-ca") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:26.816296 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.816194 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config" (OuterVolumeSpecName: "console-config") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:26.816341 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.816317 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:26.816515 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.816491 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:22:26.817857 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.817830 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm" (OuterVolumeSpecName: "kube-api-access-vzdpm") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "kube-api-access-vzdpm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 18:22:26.817954 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.817903 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:26.818268 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.818253 2582 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ad47d22c-88db-4f84-8239-fd6eb01e8a33" (UID: "ad47d22c-88db-4f84-8239-fd6eb01e8a33"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 18:22:26.916851 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916812 2582 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-serving-cert\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.916851 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916846 2582 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-oauth-serving-cert\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.916851 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916857 2582 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vzdpm\" (UniqueName: \"kubernetes.io/projected/ad47d22c-88db-4f84-8239-fd6eb01e8a33-kube-api-access-vzdpm\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.917089 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916866 2582 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-config\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.917089 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916875 2582 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-trusted-ca-bundle\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.917089 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916883 2582 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ad47d22c-88db-4f84-8239-fd6eb01e8a33-service-ca\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:26.917089 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:26.916892 2582 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ad47d22c-88db-4f84-8239-fd6eb01e8a33-console-oauth-config\") on node \"ip-10-0-139-117.ec2.internal\" DevicePath \"\""
Apr 16 18:22:27.573065 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573039 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5489cdfb5c-chtgh_ad47d22c-88db-4f84-8239-fd6eb01e8a33/console/0.log"
Apr 16 18:22:27.573510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573080 2582 generic.go:358] "Generic (PLEG): container finished" podID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" containerID="a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269" exitCode=2
Apr 16 18:22:27.573510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573128 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5489cdfb5c-chtgh" event={"ID":"ad47d22c-88db-4f84-8239-fd6eb01e8a33","Type":"ContainerDied","Data":"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"}
Apr 16 18:22:27.573510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573172 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5489cdfb5c-chtgh"
Apr 16 18:22:27.573510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573186 2582 scope.go:117] "RemoveContainer" containerID="a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"
Apr 16 18:22:27.573510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.573173 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5489cdfb5c-chtgh" event={"ID":"ad47d22c-88db-4f84-8239-fd6eb01e8a33","Type":"ContainerDied","Data":"a24b31163bad10db0288a75e4eea2aec69a9687d6ece098f0c87011fc69bbbfb"}
Apr 16 18:22:27.581431 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.581409 2582 scope.go:117] "RemoveContainer" containerID="a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"
Apr 16 18:22:27.581708 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:22:27.581684 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269\": container with ID starting with a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269 not found: ID does not exist" containerID="a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"
Apr 16 18:22:27.581756 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.581717 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269"} err="failed to get container status \"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269\": rpc error: code = NotFound desc = could not find container \"a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269\": container with ID starting with a5ca94078eea4243d1ce7558dae642ef87710be681573d192eafa1195e978269 not found: ID does not exist"
Apr 16 18:22:27.603686 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.603657 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"]
Apr 16 18:22:27.611272 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.611248 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5489cdfb5c-chtgh"]
Apr 16 18:22:27.618839 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:22:27.618813 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" path="/var/lib/kubelet/pods/ad47d22c-88db-4f84-8239-fd6eb01e8a33/volumes"
Apr 16 18:25:35.155581 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.155543 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hq8jf"]
Apr 16 18:25:35.156056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.155810 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" containerName="console"
Apr 16 18:25:35.156056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.155820 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" containerName="console"
Apr 16 18:25:35.156056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.155864 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad47d22c-88db-4f84-8239-fd6eb01e8a33" containerName="console"
Apr 16 18:25:35.158524 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.158497 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.162769 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.162748 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 18:25:35.182306 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.182275 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hq8jf"]
Apr 16 18:25:35.226721 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.226682 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-kubelet-config\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.226721 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.226731 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c521005-95e0-47f6-826f-ff6ae998da7e-original-pull-secret\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.226931 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.226760 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-dbus\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.327504 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.327459 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c521005-95e0-47f6-826f-ff6ae998da7e-original-pull-secret\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.327676 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.327517 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-dbus\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.327676 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.327578 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-kubelet-config\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.327676 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.327657 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-kubelet-config\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.327780 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.327708 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c521005-95e0-47f6-826f-ff6ae998da7e-dbus\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.329774 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.329755 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c521005-95e0-47f6-826f-ff6ae998da7e-original-pull-secret\") pod \"global-pull-secret-syncer-hq8jf\" (UID: \"3c521005-95e0-47f6-826f-ff6ae998da7e\") " pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.467531 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.467435 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hq8jf"
Apr 16 18:25:35.588873 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.588840 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hq8jf"]
Apr 16 18:25:35.591842 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:25:35.591811 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c521005_95e0_47f6_826f_ff6ae998da7e.slice/crio-5189443386e29956040f1682421834d86bab23e59cd49907c41b8dbbd3e28a5a WatchSource:0}: Error finding container 5189443386e29956040f1682421834d86bab23e59cd49907c41b8dbbd3e28a5a: Status 404 returned error can't find the container with id 5189443386e29956040f1682421834d86bab23e59cd49907c41b8dbbd3e28a5a
Apr 16 18:25:35.593552 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:35.593531 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:25:36.084060 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:36.084022 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hq8jf" event={"ID":"3c521005-95e0-47f6-826f-ff6ae998da7e","Type":"ContainerStarted","Data":"5189443386e29956040f1682421834d86bab23e59cd49907c41b8dbbd3e28a5a"}
Apr 16 18:25:40.095514 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:40.095434 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hq8jf" event={"ID":"3c521005-95e0-47f6-826f-ff6ae998da7e","Type":"ContainerStarted","Data":"31678dd1248fa6819c55bdf06d11d3d895097cc7f6f9a52916217c05960913af"}
Apr 16 18:25:40.112793 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:40.112745 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hq8jf" podStartSLOduration=1.036618029 podStartE2EDuration="5.112730378s" podCreationTimestamp="2026-04-16 18:25:35 +0000 UTC" firstStartedPulling="2026-04-16 18:25:35.593718442 +0000 UTC m=+494.569834645" lastFinishedPulling="2026-04-16 18:25:39.669830801 +0000 UTC m=+498.645946994" observedRunningTime="2026-04-16 18:25:40.111823554 +0000 UTC m=+499.087939768" watchObservedRunningTime="2026-04-16 18:25:40.112730378 +0000 UTC m=+499.088846589"
Apr 16 18:25:56.544427 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.544387 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5454bf9f8-rm6cx"]
Apr 16 18:25:56.547953 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.547927 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.550588 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.550564 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-8jw5b\""
Apr 16 18:25:56.550694 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.550564 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 16 18:25:56.550694 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.550608 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 16 18:25:56.550694 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.550615 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 16 18:25:56.551461 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.551446 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 16 18:25:56.551634 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.551617 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 16 18:25:56.551759 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.551691 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 16 18:25:56.551759 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.551726 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 16 18:25:56.560536 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.560512 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5454bf9f8-rm6cx"]
Apr 16 18:25:56.563510 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.562980 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 16 18:25:56.704515 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704472 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-trusted-ca-bundle\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704711 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704521 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-service-ca\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704711 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704609 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-oauth-serving-cert\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704711 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704661 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljm6l\" (UniqueName: \"kubernetes.io/projected/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-kube-api-access-ljm6l\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704711 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704746 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-oauth-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.704865 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.704779 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-serving-cert\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805431 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805334 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-service-ca\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805431 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805407 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-oauth-serving-cert\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805621 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805584 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljm6l\" (UniqueName: \"kubernetes.io/projected/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-kube-api-access-ljm6l\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805680 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805637 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805680 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805674 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-oauth-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805708 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-serving-cert\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.805781 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.805761 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-trusted-ca-bundle\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.806245 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.806212 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-oauth-serving-cert\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.806382 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.806212 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-service-ca\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.806451 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.806376 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.806516 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.806499 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-trusted-ca-bundle\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx"
Apr 16 18:25:56.808348 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.808326 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-serving-cert\") pod \"console-5454bf9f8-rm6cx\"
(UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:25:56.808980 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.808961 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-console-oauth-config\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:25:56.815188 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.815165 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljm6l\" (UniqueName: \"kubernetes.io/projected/2152b490-1e93-4fbb-a4d0-b7c5f6461e58-kube-api-access-ljm6l\") pod \"console-5454bf9f8-rm6cx\" (UID: \"2152b490-1e93-4fbb-a4d0-b7c5f6461e58\") " pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:25:56.859996 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.859953 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:25:56.997503 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:56.997446 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5454bf9f8-rm6cx"] Apr 16 18:25:57.001497 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:25:57.001469 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2152b490_1e93_4fbb_a4d0_b7c5f6461e58.slice/crio-1d7f622281ea7a9a9da7c4a8c42595767e9b03bc27bac03f3f9f00c857b5cb3c WatchSource:0}: Error finding container 1d7f622281ea7a9a9da7c4a8c42595767e9b03bc27bac03f3f9f00c857b5cb3c: Status 404 returned error can't find the container with id 1d7f622281ea7a9a9da7c4a8c42595767e9b03bc27bac03f3f9f00c857b5cb3c Apr 16 18:25:57.145734 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:57.145699 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5454bf9f8-rm6cx" event={"ID":"2152b490-1e93-4fbb-a4d0-b7c5f6461e58","Type":"ContainerStarted","Data":"b34282e1b87f74c16a85cdd46ccae9f9a5e8b275828adb6f1414eb5ca57e7d55"} Apr 16 18:25:57.145734 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:57.145740 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5454bf9f8-rm6cx" event={"ID":"2152b490-1e93-4fbb-a4d0-b7c5f6461e58","Type":"ContainerStarted","Data":"1d7f622281ea7a9a9da7c4a8c42595767e9b03bc27bac03f3f9f00c857b5cb3c"} Apr 16 18:25:57.170218 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:25:57.170168 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5454bf9f8-rm6cx" podStartSLOduration=1.170148559 podStartE2EDuration="1.170148559s" podCreationTimestamp="2026-04-16 18:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:25:57.169431659 +0000 UTC m=+516.145547872" 
watchObservedRunningTime="2026-04-16 18:25:57.170148559 +0000 UTC m=+516.146264772" Apr 16 18:26:06.860900 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:06.860811 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:26:06.860900 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:06.860856 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:26:06.865822 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:06.865790 2582 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:26:07.178458 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:07.178431 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5454bf9f8-rm6cx" Apr 16 18:26:16.896912 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.896874 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm"] Apr 16 18:26:16.900918 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.900902 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:16.903909 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.903886 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 18:26:16.904032 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.903886 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 18:26:16.904032 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.903989 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 18:26:16.904356 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.904340 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 18:26:16.904443 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.904392 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 18:26:16.905033 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.905012 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 18:26:16.905123 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.905050 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 18:26:16.919846 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:16.919823 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm"] Apr 16 18:26:17.061282 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.061239 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/81f9a80b-9733-47e2-be90-1a8da45d9add-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.061447 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.061290 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-ca\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.061447 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.061332 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.061447 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.061367 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.061447 ip-10-0-139-117 
kubenswrapper[2582]: I0416 18:26:17.061389 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.061447 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.061422 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xdc2\" (UniqueName: \"kubernetes.io/projected/81f9a80b-9733-47e2-be90-1a8da45d9add-kube-api-access-8xdc2\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.162605 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162572 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/81f9a80b-9733-47e2-be90-1a8da45d9add-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.162808 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162626 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-ca\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.162808 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162668 2582 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.162808 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162707 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.162808 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162730 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.163012 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.162803 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xdc2\" (UniqueName: \"kubernetes.io/projected/81f9a80b-9733-47e2-be90-1a8da45d9add-kube-api-access-8xdc2\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.163379 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.163356 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/81f9a80b-9733-47e2-be90-1a8da45d9add-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.165368 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.165344 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.165480 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.165347 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-ca\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.165685 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.165665 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.165747 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.165666 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/81f9a80b-9733-47e2-be90-1a8da45d9add-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.187165 
ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.187137 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xdc2\" (UniqueName: \"kubernetes.io/projected/81f9a80b-9733-47e2-be90-1a8da45d9add-kube-api-access-8xdc2\") pod \"cluster-proxy-proxy-agent-69784dd54c-vprmm\" (UID: \"81f9a80b-9733-47e2-be90-1a8da45d9add\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.225479 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.225436 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" Apr 16 18:26:17.367506 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:17.367471 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm"] Apr 16 18:26:17.371562 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:26:17.371538 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f9a80b_9733_47e2_be90_1a8da45d9add.slice/crio-759069dfbb6fa2793f47dd1e9504878229e255dd4afb5c3a235c50c752a4f763 WatchSource:0}: Error finding container 759069dfbb6fa2793f47dd1e9504878229e255dd4afb5c3a235c50c752a4f763: Status 404 returned error can't find the container with id 759069dfbb6fa2793f47dd1e9504878229e255dd4afb5c3a235c50c752a4f763 Apr 16 18:26:18.213662 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:18.213612 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" event={"ID":"81f9a80b-9733-47e2-be90-1a8da45d9add","Type":"ContainerStarted","Data":"759069dfbb6fa2793f47dd1e9504878229e255dd4afb5c3a235c50c752a4f763"} Apr 16 18:26:20.221163 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:20.221123 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" event={"ID":"81f9a80b-9733-47e2-be90-1a8da45d9add","Type":"ContainerStarted","Data":"360bec99ef560f1ee4194bc2a8e41823fad5e638e808047e53b591fbaac63757"} Apr 16 18:26:23.233321 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:23.233279 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" event={"ID":"81f9a80b-9733-47e2-be90-1a8da45d9add","Type":"ContainerStarted","Data":"d47e2506d0a2016b11db0251a867bac4f42f7826deb872606f1c629dcdea323b"} Apr 16 18:26:23.233321 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:23.233324 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" event={"ID":"81f9a80b-9733-47e2-be90-1a8da45d9add","Type":"ContainerStarted","Data":"32c14a87d6566d08d857dac4dc3647f3ddac9dee971cbb103c58fa41bb3f09db"} Apr 16 18:26:23.266346 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:26:23.266297 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-69784dd54c-vprmm" podStartSLOduration=2.359759885 podStartE2EDuration="7.266282619s" podCreationTimestamp="2026-04-16 18:26:16 +0000 UTC" firstStartedPulling="2026-04-16 18:26:17.373191934 +0000 UTC m=+536.349308124" lastFinishedPulling="2026-04-16 18:26:22.279714668 +0000 UTC m=+541.255830858" observedRunningTime="2026-04-16 18:26:23.264449033 +0000 UTC m=+542.240565325" watchObservedRunningTime="2026-04-16 18:26:23.266282619 +0000 UTC m=+542.242398830" Apr 16 18:27:21.538298 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:27:21.538264 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log" Apr 16 18:27:21.539044 ip-10-0-139-117 kubenswrapper[2582]: I0416 
18:27:21.539022 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log" Apr 16 18:27:21.542451 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:27:21.542430 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log" Apr 16 18:27:21.543413 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:27:21.543391 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log" Apr 16 18:28:41.405371 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.405331 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-8dznc"] Apr 16 18:28:41.408507 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.408489 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.411056 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.411029 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:28:41.411279 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.411256 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:28:41.411991 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.411971 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 18:28:41.412072 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.411972 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-rzx4z\"" Apr 16 18:28:41.418215 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.418189 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8dznc"] Apr 16 18:28:41.512390 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.512357 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x4gl\" (UniqueName: \"kubernetes.io/projected/07036ef4-189e-490c-8cd5-9f8ce24304ae-kube-api-access-9x4gl\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.512554 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.512419 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07036ef4-189e-490c-8cd5-9f8ce24304ae-tls-certs\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.613469 ip-10-0-139-117 kubenswrapper[2582]: 
I0416 18:28:41.613434 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07036ef4-189e-490c-8cd5-9f8ce24304ae-tls-certs\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.613670 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.613509 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9x4gl\" (UniqueName: \"kubernetes.io/projected/07036ef4-189e-490c-8cd5-9f8ce24304ae-kube-api-access-9x4gl\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.616588 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.616556 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/07036ef4-189e-490c-8cd5-9f8ce24304ae-tls-certs\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.623813 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.623784 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x4gl\" (UniqueName: \"kubernetes.io/projected/07036ef4-189e-490c-8cd5-9f8ce24304ae-kube-api-access-9x4gl\") pod \"model-serving-api-86f7b4b499-8dznc\" (UID: \"07036ef4-189e-490c-8cd5-9f8ce24304ae\") " pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.720758 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.720674 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:41.843774 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:41.843746 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-8dznc"] Apr 16 18:28:41.846382 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:28:41.846344 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07036ef4_189e_490c_8cd5_9f8ce24304ae.slice/crio-16e1036a848019b4230cea9695ac54424165fa67d95aa0ea71b2d0bf071db6e0 WatchSource:0}: Error finding container 16e1036a848019b4230cea9695ac54424165fa67d95aa0ea71b2d0bf071db6e0: Status 404 returned error can't find the container with id 16e1036a848019b4230cea9695ac54424165fa67d95aa0ea71b2d0bf071db6e0 Apr 16 18:28:42.632596 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:42.632553 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8dznc" event={"ID":"07036ef4-189e-490c-8cd5-9f8ce24304ae","Type":"ContainerStarted","Data":"16e1036a848019b4230cea9695ac54424165fa67d95aa0ea71b2d0bf071db6e0"} Apr 16 18:28:44.639836 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:44.639735 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-8dznc" event={"ID":"07036ef4-189e-490c-8cd5-9f8ce24304ae","Type":"ContainerStarted","Data":"7cf040bba8214dd57c2973164b9928fea00b666122dacde88c6b618125b77807"} Apr 16 18:28:44.640313 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:44.639887 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-8dznc" Apr 16 18:28:44.658920 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:44.658873 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-8dznc" podStartSLOduration=1.3139433839999999 podStartE2EDuration="3.658861239s" 
podCreationTimestamp="2026-04-16 18:28:41 +0000 UTC" firstStartedPulling="2026-04-16 18:28:41.848664068 +0000 UTC m=+680.824780262" lastFinishedPulling="2026-04-16 18:28:44.193581918 +0000 UTC m=+683.169698117" observedRunningTime="2026-04-16 18:28:44.657235744 +0000 UTC m=+683.633351999" watchObservedRunningTime="2026-04-16 18:28:44.658861239 +0000 UTC m=+683.634977449"
Apr 16 18:28:55.646888 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:28:55.646859 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-8dznc"
Apr 16 18:29:42.835336 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.835255 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"]
Apr 16 18:29:42.837648 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.837632 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:42.840222 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.840201 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 18:29:42.841302 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.841286 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bqs8m\""
Apr 16 18:29:42.841356 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.841304 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 16 18:29:42.846966 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.846941 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"]
Apr 16 18:29:42.908220 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.908186 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlttb\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-kube-api-access-nlttb\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:42.908387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.908236 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/051cef85-59a2-45a1-847f-6e48502f414f-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:42.908387 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:42.908324 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.009106 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.009057 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlttb\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-kube-api-access-nlttb\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.009253 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.009155 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/051cef85-59a2-45a1-847f-6e48502f414f-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.009253 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.009204 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.009595 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.009571 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/051cef85-59a2-45a1-847f-6e48502f414f-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.011908 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.011885 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.018947 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.018921 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlttb\" (UniqueName: \"kubernetes.io/projected/051cef85-59a2-45a1-847f-6e48502f414f-kube-api-access-nlttb\") pod \"seaweedfs-tls-custom-5c88b85bb7-6q4fc\" (UID: \"051cef85-59a2-45a1-847f-6e48502f414f\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.147077 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.146999 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"
Apr 16 18:29:43.269234 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.269212 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc"]
Apr 16 18:29:43.271901 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:29:43.271874 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod051cef85_59a2_45a1_847f_6e48502f414f.slice/crio-8f4c813de6990868b61fc8ba5ed16556d40f092272607e49473fae2467ae553d WatchSource:0}: Error finding container 8f4c813de6990868b61fc8ba5ed16556d40f092272607e49473fae2467ae553d: Status 404 returned error can't find the container with id 8f4c813de6990868b61fc8ba5ed16556d40f092272607e49473fae2467ae553d
Apr 16 18:29:43.814458 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:43.814421 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc" event={"ID":"051cef85-59a2-45a1-847f-6e48502f414f","Type":"ContainerStarted","Data":"8f4c813de6990868b61fc8ba5ed16556d40f092272607e49473fae2467ae553d"}
Apr 16 18:29:46.824622 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:46.824570 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc" event={"ID":"051cef85-59a2-45a1-847f-6e48502f414f","Type":"ContainerStarted","Data":"b12fbceed2db10f8fa723b7c3c85d158a79643b50716bf7f1d6a9d94d0cfaa4f"}
Apr 16 18:29:46.846189 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:29:46.846138 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-6q4fc" podStartSLOduration=2.280881101 podStartE2EDuration="4.846124097s" podCreationTimestamp="2026-04-16 18:29:42 +0000 UTC" firstStartedPulling="2026-04-16 18:29:43.273167836 +0000 UTC m=+742.249284028" lastFinishedPulling="2026-04-16 18:29:45.838410824 +0000 UTC m=+744.814527024" observedRunningTime="2026-04-16 18:29:46.845832554 +0000 UTC m=+745.821948764" watchObservedRunningTime="2026-04-16 18:29:46.846124097 +0000 UTC m=+745.822240308"
Apr 16 18:32:21.563542 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:32:21.563460 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:32:21.564058 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:32:21.563726 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:32:21.568158 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:32:21.568131 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:32:21.568334 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:32:21.568320 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:33:22.502209 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.502173 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:33:22.504602 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.504585 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:33:22.507183 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.507163 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdsxq\""
Apr 16 18:33:22.514582 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.514558 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:33:22.515933 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.515912 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:33:22.644413 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.644381 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:33:22.647417 ip-10-0-139-117 kubenswrapper[2582]: W0416 18:33:22.647384 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c33d60f_bfc4_4206_9689_00b3ae061837.slice/crio-a295d53bc8ec9c726bb25ea9ade555312314fbdcec7181f7c347e85368002ca5 WatchSource:0}: Error finding container a295d53bc8ec9c726bb25ea9ade555312314fbdcec7181f7c347e85368002ca5: Status 404 returned error can't find the container with id a295d53bc8ec9c726bb25ea9ade555312314fbdcec7181f7c347e85368002ca5
Apr 16 18:33:22.649318 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:22.649300 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:33:23.451855 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:23.451816 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" event={"ID":"3c33d60f-bfc4-4206-9689-00b3ae061837","Type":"ContainerStarted","Data":"a295d53bc8ec9c726bb25ea9ade555312314fbdcec7181f7c347e85368002ca5"}
Apr 16 18:33:24.456945 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:24.456911 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" event={"ID":"3c33d60f-bfc4-4206-9689-00b3ae061837","Type":"ContainerStarted","Data":"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"}
Apr 16 18:33:24.457386 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:24.457210 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:33:24.459214 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:24.459192 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:33:24.475637 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:33:24.475571 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" podStartSLOduration=1.558283566 podStartE2EDuration="2.475551705s" podCreationTimestamp="2026-04-16 18:33:22 +0000 UTC" firstStartedPulling="2026-04-16 18:33:22.649429482 +0000 UTC m=+961.625545671" lastFinishedPulling="2026-04-16 18:33:23.566697617 +0000 UTC m=+962.542813810" observedRunningTime="2026-04-16 18:33:24.475019761 +0000 UTC m=+963.451135973" watchObservedRunningTime="2026-04-16 18:33:24.475551705 +0000 UTC m=+963.451667917"
Apr 16 18:34:57.594369 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:57.594331 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-8zjz9_3c33d60f-bfc4-4206-9689-00b3ae061837/kserve-container/0.log"
Apr 16 18:34:57.926874 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:57.926837 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:34:57.927136 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:57.927110 2582 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" podUID="3c33d60f-bfc4-4206-9689-00b3ae061837" containerName="kserve-container" containerID="cri-o://cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f" gracePeriod=30
Apr 16 18:34:58.164159 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.164134 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:34:58.734264 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.734227 2582 generic.go:358] "Generic (PLEG): container finished" podID="3c33d60f-bfc4-4206-9689-00b3ae061837" containerID="cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f" exitCode=2
Apr 16 18:34:58.734684 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.734294 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" event={"ID":"3c33d60f-bfc4-4206-9689-00b3ae061837","Type":"ContainerDied","Data":"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"}
Apr 16 18:34:58.734684 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.734307 2582 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"
Apr 16 18:34:58.734684 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.734327 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9" event={"ID":"3c33d60f-bfc4-4206-9689-00b3ae061837","Type":"ContainerDied","Data":"a295d53bc8ec9c726bb25ea9ade555312314fbdcec7181f7c347e85368002ca5"}
Apr 16 18:34:58.734684 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.734344 2582 scope.go:117] "RemoveContainer" containerID="cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"
Apr 16 18:34:58.742631 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.742613 2582 scope.go:117] "RemoveContainer" containerID="cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"
Apr 16 18:34:58.742998 ip-10-0-139-117 kubenswrapper[2582]: E0416 18:34:58.742974 2582 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f\": container with ID starting with cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f not found: ID does not exist" containerID="cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"
Apr 16 18:34:58.743058 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.743012 2582 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f"} err="failed to get container status \"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f\": rpc error: code = NotFound desc = could not find container \"cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f\": container with ID starting with cf9bb1734f0601d42535e274dab60ae17ae4b6f242b9fd19012f58d976128d7f not found: ID does not exist"
Apr 16 18:34:58.757181 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.757153 2582 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:34:58.762161 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:58.762136 2582 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-8zjz9"]
Apr 16 18:34:59.618868 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:34:59.618821 2582 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c33d60f-bfc4-4206-9689-00b3ae061837" path="/var/lib/kubelet/pods/3c33d60f-bfc4-4206-9689-00b3ae061837/volumes"
Apr 16 18:37:21.585273 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:37:21.585239 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:37:21.586022 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:37:21.586002 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:37:21.589654 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:37:21.589633 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:37:21.590337 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:37:21.590315 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:42:21.606511 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:42:21.606480 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:42:21.607508 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:42:21.607486 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:42:21.611142 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:42:21.611119 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:42:21.611793 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:42:21.611769 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:47:21.629858 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:47:21.629828 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:47:21.632362 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:47:21.632335 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:47:21.634141 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:47:21.634123 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:47:21.636824 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:47:21.636803 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:52:21.651887 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:52:21.651209 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:52:21.657460 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:52:21.657431 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:52:21.659596 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:52:21.659575 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:52:21.662025 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:52:21.662009 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:57:21.675944 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:57:21.675913 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:57:21.678367 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:57:21.678342 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 18:57:21.679934 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:57:21.679902 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 18:57:21.682548 ip-10-0-139-117 kubenswrapper[2582]: I0416 18:57:21.682521 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:02:21.697142 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:02:21.697047 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:02:21.699968 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:02:21.699942 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:02:21.701261 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:02:21.701242 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:02:21.703995 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:02:21.703975 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:07:21.721033 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:07:21.721002 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:07:21.724503 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:07:21.724478 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:07:21.725691 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:07:21.725673 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:07:21.728755 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:07:21.728738 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:12:21.748206 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:12:21.748179 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:12:21.750706 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:12:21.750683 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:12:21.752419 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:12:21.752399 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:12:21.754481 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:12:21.754463 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:17:21.774146 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:17:21.774116 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:17:21.776451 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:17:21.776428 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:17:21.778088 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:17:21.778068 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:17:21.780502 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:17:21.780481 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:22:21.794151 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:22:21.794116 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:22:21.803237 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:22:21.803206 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:22:21.803728 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:22:21.803709 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:22:21.807710 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:22:21.807691 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log"
Apr 16 19:24:24.255708 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.255649 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7m7f8/must-gather-v659b"]
Apr 16 19:24:24.256194 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.255972 2582 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c33d60f-bfc4-4206-9689-00b3ae061837" containerName="kserve-container"
Apr 16 19:24:24.256194 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.255983 2582 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c33d60f-bfc4-4206-9689-00b3ae061837" containerName="kserve-container"
Apr 16 19:24:24.256194 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.256046 2582 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c33d60f-bfc4-4206-9689-00b3ae061837" containerName="kserve-container"
Apr 16 19:24:24.259028 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.259009 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.262041 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.262021 2582 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7m7f8\"/\"default-dockercfg-jlsz5\""
Apr 16 19:24:24.262218 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.262202 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"openshift-service-ca.crt\""
Apr 16 19:24:24.263023 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.263008 2582 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"kube-root-ca.crt\""
Apr 16 19:24:24.269086 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.269064 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/must-gather-v659b"]
Apr 16 19:24:24.345286 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.345243 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e42d9412-0865-45c1-ab71-52c7b079ebe2-must-gather-output\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.345286 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.345287 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4mgx\" (UniqueName: \"kubernetes.io/projected/e42d9412-0865-45c1-ab71-52c7b079ebe2-kube-api-access-t4mgx\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.446596 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.446553 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e42d9412-0865-45c1-ab71-52c7b079ebe2-must-gather-output\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.446596 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.446595 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4mgx\" (UniqueName: \"kubernetes.io/projected/e42d9412-0865-45c1-ab71-52c7b079ebe2-kube-api-access-t4mgx\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.446927 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.446907 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e42d9412-0865-45c1-ab71-52c7b079ebe2-must-gather-output\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.456293 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.456256 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4mgx\" (UniqueName: \"kubernetes.io/projected/e42d9412-0865-45c1-ab71-52c7b079ebe2-kube-api-access-t4mgx\") pod \"must-gather-v659b\" (UID: \"e42d9412-0865-45c1-ab71-52c7b079ebe2\") " pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.568618 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.568511 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/must-gather-v659b"
Apr 16 19:24:24.696314 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.696278 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/must-gather-v659b"]
Apr 16 19:24:24.699176 ip-10-0-139-117 kubenswrapper[2582]: W0416 19:24:24.699143 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42d9412_0865_45c1_ab71_52c7b079ebe2.slice/crio-9c1e34545d19dfc7d96d310e1a75f7b083ebc3ef8c833e1025bb742841f4972f WatchSource:0}: Error finding container 9c1e34545d19dfc7d96d310e1a75f7b083ebc3ef8c833e1025bb742841f4972f: Status 404 returned error can't find the container with id 9c1e34545d19dfc7d96d310e1a75f7b083ebc3ef8c833e1025bb742841f4972f
Apr 16 19:24:24.700912 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:24.700884 2582 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:24:25.254909 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:25.254863 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/must-gather-v659b" event={"ID":"e42d9412-0865-45c1-ab71-52c7b079ebe2","Type":"ContainerStarted","Data":"9c1e34545d19dfc7d96d310e1a75f7b083ebc3ef8c833e1025bb742841f4972f"}
Apr 16 19:24:26.260510 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:26.260462 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/must-gather-v659b" event={"ID":"e42d9412-0865-45c1-ab71-52c7b079ebe2","Type":"ContainerStarted","Data":"d7707f3ec330830f8d395979e02b7061b4562e9b64a4f1f5af6217999506118c"}
Apr 16 19:24:26.260987 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:26.260517 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/must-gather-v659b" event={"ID":"e42d9412-0865-45c1-ab71-52c7b079ebe2","Type":"ContainerStarted","Data":"50a30d5e1d7f67e78301f14bb6bfbe91d9a1f2e22cf48b33cd6a7ce5b88f08cc"}
Apr 16 19:24:26.280832 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:26.280770 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7m7f8/must-gather-v659b" podStartSLOduration=1.490802475 podStartE2EDuration="2.280754472s" podCreationTimestamp="2026-04-16 19:24:24 +0000 UTC" firstStartedPulling="2026-04-16 19:24:24.701005955 +0000 UTC m=+4023.677122144" lastFinishedPulling="2026-04-16 19:24:25.490957938 +0000 UTC m=+4024.467074141" observedRunningTime="2026-04-16 19:24:26.278967286 +0000 UTC m=+4025.255083498" watchObservedRunningTime="2026-04-16 19:24:26.280754472 +0000 UTC m=+4025.256870720"
Apr 16 19:24:27.025377 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:27.025342 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hq8jf_3c521005-95e0-47f6-826f-ff6ae998da7e/global-pull-secret-syncer/0.log"
Apr 16 19:24:27.254377 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:27.254345 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7t997_3bf7ebd6-29a1-4bfa-957a-f30502b557e5/konnectivity-agent/0.log"
Apr 16 19:24:27.377067 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:27.376988 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-117.ec2.internal_4c9d1315a36f55d6d5f62663bd6ecf76/haproxy/0.log"
Apr 16 19:24:30.427410 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.427379 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-9b99n_fb59922a-7f8a-4dfe-aeec-b44fcb1ba20b/cluster-monitoring-operator/0.log"
Apr 16 19:24:30.554645 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.554615 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-555fc6689c-q47b2_bfc557f8-02ab-46e3-8b68-0db64836c12b/metrics-server/0.log"
Apr 16 19:24:30.588547 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.588522 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-t857x_c23ee750-8c36-4728-bd70-7a3ad044554d/monitoring-plugin/0.log"
Apr 16 19:24:30.832719 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.832620 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lp8sq_84b72a6f-e66f-473d-98f9-2c24e5660d4d/node-exporter/0.log"
Apr 16 19:24:30.861808 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.861779 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lp8sq_84b72a6f-e66f-473d-98f9-2c24e5660d4d/kube-rbac-proxy/0.log"
Apr 16 19:24:30.893766 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:30.893735 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-lp8sq_84b72a6f-e66f-473d-98f9-2c24e5660d4d/init-textfile/0.log"
Apr 16 19:24:33.365696 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:33.365660 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/2.log"
Apr 16 19:24:33.370558 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:33.370521 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-kmx4l_16f4257e-9483-4d05-bec2-a89b52ff2015/console-operator/3.log"
Apr 16 19:24:33.769626 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:33.769598 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5454bf9f8-rm6cx_2152b490-1e93-4fbb-a4d0-b7c5f6461e58/console/0.log"
Apr 16 19:24:34.236978 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.236937 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-flgw2_ff19de9b-cc93-426d-9316-5a00b8359309/volume-data-source-validator/0.log"
Apr 16 19:24:34.461549 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.461515 2582 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"]
Apr 16 19:24:34.465652 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.465626 2582 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"
Apr 16 19:24:34.477068 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.477043 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"]
Apr 16 19:24:34.640785 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.640695 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-sys\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"
Apr 16 19:24:34.640928 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.640781 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-lib-modules\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"
Apr 16 19:24:34.640928 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.640811 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: 
\"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-podres\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.640928 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.640904 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-proc\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.641054 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.640943 2582 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6fq\" (UniqueName: \"kubernetes.io/projected/5d02bc92-dafd-4afe-a030-4b52a71f8866-kube-api-access-2q6fq\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741415 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741374 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-lib-modules\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741415 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741415 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-podres\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 
ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741485 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-proc\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741522 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6fq\" (UniqueName: \"kubernetes.io/projected/5d02bc92-dafd-4afe-a030-4b52a71f8866-kube-api-access-2q6fq\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741565 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-lib-modules\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741589 2582 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-sys\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741614 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-podres\") pod \"perf-node-gather-daemonset-rjg52\" (UID: 
\"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741631 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-sys\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.741663 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.741592 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5d02bc92-dafd-4afe-a030-4b52a71f8866-proc\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.752055 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.752022 2582 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6fq\" (UniqueName: \"kubernetes.io/projected/5d02bc92-dafd-4afe-a030-4b52a71f8866-kube-api-access-2q6fq\") pod \"perf-node-gather-daemonset-rjg52\" (UID: \"5d02bc92-dafd-4afe-a030-4b52a71f8866\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.778143 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.778108 2582 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:34.923907 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:34.923849 2582 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52"] Apr 16 19:24:34.929510 ip-10-0-139-117 kubenswrapper[2582]: W0416 19:24:34.929481 2582 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5d02bc92_dafd_4afe_a030_4b52a71f8866.slice/crio-924b4eaa9c387ed28a13cac83d14827b801a7a67306321bf4ce16645c6b795e7 WatchSource:0}: Error finding container 924b4eaa9c387ed28a13cac83d14827b801a7a67306321bf4ce16645c6b795e7: Status 404 returned error can't find the container with id 924b4eaa9c387ed28a13cac83d14827b801a7a67306321bf4ce16645c6b795e7 Apr 16 19:24:35.151847 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.151813 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-js745_2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1/dns/0.log" Apr 16 19:24:35.181568 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.181494 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-js745_2ea7a9c5-06e5-43bc-86f8-4ae8371bb1d1/kube-rbac-proxy/0.log" Apr 16 19:24:35.243852 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.243822 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-cj779_f68745e4-3c2b-4cbe-80a9-80320d887584/dns-node-resolver/0.log" Apr 16 19:24:35.293164 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.293130 2582 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" event={"ID":"5d02bc92-dafd-4afe-a030-4b52a71f8866","Type":"ContainerStarted","Data":"e1dab9a173564541f4a566649503052d627cfb7feea56f568781f2a18c342b36"} Apr 16 19:24:35.293164 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.293168 2582 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" event={"ID":"5d02bc92-dafd-4afe-a030-4b52a71f8866","Type":"ContainerStarted","Data":"924b4eaa9c387ed28a13cac83d14827b801a7a67306321bf4ce16645c6b795e7"} Apr 16 19:24:35.293414 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.293218 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:35.313318 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.313231 2582 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" podStartSLOduration=1.313210684 podStartE2EDuration="1.313210684s" podCreationTimestamp="2026-04-16 19:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:24:35.312440153 +0000 UTC m=+4034.288556366" watchObservedRunningTime="2026-04-16 19:24:35.313210684 +0000 UTC m=+4034.289326896" Apr 16 19:24:35.823721 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:35.823696 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-n229m_bead58a1-c7d1-4221-8dba-7355ad1eee28/node-ca/0.log" Apr 16 19:24:37.165576 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:37.165545 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vf74w_0362b269-7b97-4579-a5a1-f882325a361a/serve-healthcheck-canary/0.log" Apr 16 19:24:37.690653 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:37.690615 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qx2kf_759c41ce-9647-447d-83db-cedf1e89428e/kube-rbac-proxy/0.log" Apr 16 19:24:37.719463 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:37.719434 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-qx2kf_759c41ce-9647-447d-83db-cedf1e89428e/exporter/0.log" Apr 16 19:24:37.748868 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:37.748840 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qx2kf_759c41ce-9647-447d-83db-cedf1e89428e/extractor/0.log" Apr 16 19:24:40.051515 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:40.051487 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-8dznc_07036ef4-189e-490c-8cd5-9f8ce24304ae/server/0.log" Apr 16 19:24:40.642484 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:40.642457 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-6q4fc_051cef85-59a2-45a1-847f-6e48502f414f/seaweedfs-tls-custom/0.log" Apr 16 19:24:41.309176 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:41.309143 2582 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-rjg52" Apr 16 19:24:45.365357 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:45.365327 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-bf6n2_01d4cf30-f9cf-4378-9264-d1541f508a34/migrator/0.log" Apr 16 19:24:45.393024 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:45.392991 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-bf6n2_01d4cf30-f9cf-4378-9264-d1541f508a34/graceful-termination/0.log" Apr 16 19:24:45.793548 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:45.793510 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-m887h_ead2ae15-459e-4b99-898d-ae36578d9ffa/kube-storage-version-migrator-operator/1.log" Apr 16 19:24:45.794801 
ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:45.794769 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-m887h_ead2ae15-459e-4b99-898d-ae36578d9ffa/kube-storage-version-migrator-operator/0.log" Apr 16 19:24:46.807550 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:46.807519 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-4mngt_12cb712b-2a1d-4af9-a5dc-79912365f003/kube-multus/0.log" Apr 16 19:24:47.262229 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.262202 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/kube-multus-additional-cni-plugins/0.log" Apr 16 19:24:47.303772 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.303732 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/egress-router-binary-copy/0.log" Apr 16 19:24:47.331080 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.331049 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/cni-plugins/0.log" Apr 16 19:24:47.356942 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.356909 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/bond-cni-plugin/0.log" Apr 16 19:24:47.384854 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.384826 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/routeoverride-cni/0.log" Apr 16 19:24:47.412054 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.412024 2582 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/whereabouts-cni-bincopy/0.log" Apr 16 19:24:47.441678 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.441643 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hpm7j_c19385c2-b1c2-45bc-a50b-91342bfe5265/whereabouts-cni/0.log" Apr 16 19:24:47.553529 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.553453 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4zpgf_847e2695-c897-4ed9-95c4-10d0fbef9e09/network-metrics-daemon/0.log" Apr 16 19:24:47.578579 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:47.578545 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4zpgf_847e2695-c897-4ed9-95c4-10d0fbef9e09/kube-rbac-proxy/0.log" Apr 16 19:24:48.505442 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.505412 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-controller/0.log" Apr 16 19:24:48.538953 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.538924 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/0.log" Apr 16 19:24:48.560501 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.560463 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovn-acl-logging/1.log" Apr 16 19:24:48.586896 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.586871 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/kube-rbac-proxy-node/0.log" Apr 16 19:24:48.612259 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.612232 
2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:24:48.639461 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.639430 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/northd/0.log" Apr 16 19:24:48.669614 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.669583 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/nbdb/0.log" Apr 16 19:24:48.701739 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.701714 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/sbdb/0.log" Apr 16 19:24:48.823450 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:48.823361 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2frx4_74787fd3-6aff-45fa-b4f4-4f97b01f0899/ovnkube-controller/0.log" Apr 16 19:24:50.864862 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:50.864835 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-hzrtb_15d17920-d56e-4995-bbc2-c4b5a72e3162/check-endpoints/0.log" Apr 16 19:24:50.925901 ip-10-0-139-117 kubenswrapper[2582]: I0416 19:24:50.925868 2582 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-crg6m_439470b0-687a-4bea-ad03-3eebe6cb41cd/network-check-target-container/0.log"