Apr 22 18:17:58.527321 ip-10-0-143-95 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:17:58.527330 ip-10-0-143-95 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:17:58.527338 ip-10-0-143-95 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:17:58.527489 ip-10-0-143-95 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:18:08.572239 ip-10-0-143-95 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:18:08.572253 ip-10-0-143-95 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 17503638c6824e5aa5ed1e3854e066f9 --
Apr 22 18:20:28.278864 ip-10-0-143-95 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:20:28.665957 ip-10-0-143-95 kubenswrapper[2561]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:28.665957 ip-10-0-143-95 kubenswrapper[2561]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:20:28.665957 ip-10-0-143-95 kubenswrapper[2561]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:28.665957 ip-10-0-143-95 kubenswrapper[2561]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:20:28.665957 ip-10-0-143-95 kubenswrapper[2561]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:20:28.667465 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.667374 2561 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:20:28.669638 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669622 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:28.669638 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669639 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669644 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669648 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669651 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669654 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669657 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669660 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669663 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669665 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669668 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669670 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669673 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669676 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669678 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669681 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669684 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669692 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669694 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669697 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669700 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669702 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669705 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669707 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669710 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669713 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669716 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669719 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669722 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669725 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669727 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669730 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669732 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669735 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669745 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669748 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669750 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669753 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669755 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669758 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669761 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:28.670213 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669763 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669765 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669768 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669771 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669773 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669776 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669778 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669781 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669785 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669787 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669790 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669793 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669795 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669798 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669801 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669804 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669807 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669810 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669814 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669818 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:28.670714 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669820 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669823 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669825 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669828 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669833 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669836 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669839 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669842 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669845 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669847 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669850 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669853 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669855 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669858 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669860 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669863 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669866 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669868 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669871 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:28.671194 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669873 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669876 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669879 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669882 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669884 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669886 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670279 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670286 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670289 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670292 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670296 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670299 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670302 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670304 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670307 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670310 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670312 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670315 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670317 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670320 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:28.671666 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670323 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670325 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670328 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670331 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670334 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670336 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670339 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670342 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670344 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670347 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670350 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670354 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670358 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670361 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670364 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670367 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670369 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670372 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670375 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670378 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670381 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670385 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
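Most of this journal is the same `feature_gate.go:328] unrecognized feature gate:` warning repeated for dozens of gate names, and from timestamp `18:20:28.670279` onward the same set appears to be logged a second time. When triaging a capture like this, it can help to collapse the noise into per-gate counts before reading further. A minimal sketch — the helper name `count_unrecognized_gates` and the inline three-line sample are mine, not part of the log:

```python
import re
from collections import Counter

# Matches the kubenswrapper warning format seen in this journal.
GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

def count_unrecognized_gates(journal_text: str) -> Counter:
    """Return a Counter mapping feature-gate name -> number of warnings."""
    return Counter(GATE_RE.findall(journal_text))

# Tiny sample in the same shape as the journal above (abbreviated).
sample = """\
Apr 22 18:20:28.669638 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669622 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:28.672145 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670331 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:28.669706 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.669648 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
"""

for gate, n in count_unrecognized_gates(sample).most_common():
    print(gate, n)
```

Run against the full journal text, a count of exactly 2 for each gate would confirm the warnings are a duplicated dump rather than a growing error loop.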
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670389 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670392 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670395 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670398 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670400 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670403 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670406 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670408 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670411 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670414 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670416 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670419 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670422 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670424 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670427 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670430 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670432 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:28.672655 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670435 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670439 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670441 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670444 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670447 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670449 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670452 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670454 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670457 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670459 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670462 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670465 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670467 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670470 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670472 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670475 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670478 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670480 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670483 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670485 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:28.673153 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670488 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670490 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670493 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670496 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670498 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670501 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670504 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670506 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670521 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670524 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670526 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670529 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.670532 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671139 2561 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671154 2561 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671163 2561 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671168 2561 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671172 2561 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671176 2561 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671181 2561 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671185 2561 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:20:28.673699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671188 2561 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671192 2561 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671195 2561 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671199 2561 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671202 2561 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671205 2561 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671209 2561 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671212 2561 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671215 2561 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671218 2561 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671221 2561 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671226 2561 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671229 2561 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671232 2561 flags.go:64] FLAG: --config-dir=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671235 2561 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671238 2561 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671243 2561 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671246 2561 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671249 2561 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671252 2561 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671256 2561 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671259 2561 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671262 2561 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671267 2561 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671270 2561 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:20:28.674202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671274 2561 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671277 2561 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671280 2561 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671283 2561 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671286 2561 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671289 2561 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671295 2561 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671298 2561 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671301 2561 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671305 2561 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671309 2561 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671313 2561 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671316 2561 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671319 2561 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671322 2561 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671325 2561 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671328 2561 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671331 2561 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671334 2561 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671337 2561 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671340 2561 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671343 2561 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671347 2561 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671350 2561 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:20:28.674828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671353 2561 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:20:28.674828
ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671357 2561 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671360 2561 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671363 2561 flags.go:64] FLAG: --help="false" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671366 2561 flags.go:64] FLAG: --hostname-override="ip-10-0-143-95.ec2.internal" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671369 2561 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671376 2561 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671380 2561 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671383 2561 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671387 2561 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671390 2561 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671393 2561 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671396 2561 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671399 2561 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671402 2561 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671405 2561 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671408 2561 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671411 2561 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671415 2561 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671418 2561 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671422 2561 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671424 2561 flags.go:64] FLAG: --lock-file="" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671428 2561 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671431 2561 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671434 2561 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:20:28.675418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671439 2561 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671442 2561 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671445 2561 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671448 2561 flags.go:64] FLAG: --logging-format="text" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 
18:20:28.671451 2561 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671454 2561 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671457 2561 flags.go:64] FLAG: --manifest-url="" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671460 2561 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671465 2561 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671468 2561 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671472 2561 flags.go:64] FLAG: --max-pods="110" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671475 2561 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671478 2561 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671483 2561 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671486 2561 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671490 2561 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671493 2561 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671496 2561 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671503 2561 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671506 2561 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671524 2561 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671529 2561 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671534 2561 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:20:28.676017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671540 2561 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671543 2561 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671548 2561 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671551 2561 flags.go:64] FLAG: --port="10250" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671554 2561 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671557 2561 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b957f02631380309" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671560 2561 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671564 2561 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671567 2561 flags.go:64] FLAG: --register-node="true" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671570 2561 flags.go:64] FLAG: --register-schedulable="true" Apr 22 
18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671573 2561 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671576 2561 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671579 2561 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671582 2561 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671585 2561 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671589 2561 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671592 2561 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671595 2561 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671598 2561 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671601 2561 flags.go:64] FLAG: --runonce="false" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671603 2561 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671607 2561 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671610 2561 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671614 2561 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671617 2561 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671621 2561 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 18:20:28.676590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671624 2561 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671627 2561 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671630 2561 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671633 2561 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671636 2561 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671639 2561 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671642 2561 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671645 2561 flags.go:64] FLAG: --system-cgroups=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671652 2561 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671657 2561 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671660 2561 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671663 2561 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671667 2561 flags.go:64] FLAG: --tls-min-version=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671670 2561 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671673 2561 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671677 2561 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671680 2561 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671683 2561 flags.go:64] FLAG: --v="2"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671687 2561 flags.go:64] FLAG: --version="false"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671694 2561 flags.go:64] FLAG: --vmodule=""
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671699 2561 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.671702 2561 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671799 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671803 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:28.677222 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671806 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671809 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671812 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671815 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671818 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671823 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671826 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671829 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671834 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671838 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671841 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671844 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671847 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671849 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671852 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671854 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671858 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671861 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671864 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:20:28.677822 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671866 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671869 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671872 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671874 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671877 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671880 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671882 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671885 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671888 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671890 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671893 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671895 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671898 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671900 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671903 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671906 2561 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671908 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671911 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671915 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671918 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:20:28.678315 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671920 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671923 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671926 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671928 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671931 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671933 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671936 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671938 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671941 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671945 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671948 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671950 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671953 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671956 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671958 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671961 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671963 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671966 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671968 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671971 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:20:28.678829 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671973 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671976 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671979 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671981 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671984 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671986 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671989 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671991 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671994 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.671997 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672001 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672003 2561 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672013 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672017 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672020 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672027 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672031 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672033 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672036 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672039 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:28.679313 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672041 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672046 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672048 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672051 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.672054 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.672855 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.678908 2561 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.678923 2561 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678971 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678976 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678979 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678982 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678986 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678988 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678991 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678994 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:20:28.679804 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678996 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.678999 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679002 2561 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679005 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679007 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679010 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679013 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679015 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679018 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679021 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679024 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679026 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679029 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679032 2561 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679034 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679037 2561 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679039 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679042 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679044 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679047 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:20:28.680224 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679049 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679052 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679054 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679059 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679062 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679065 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679068 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679070 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:20:28.680772 ip-10-0-143-95
kubenswrapper[2561]: W0422 18:20:28.679073 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679077 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679082 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679085 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679088 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679091 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679094 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679097 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679100 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679102 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679105 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:28.680772 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679107 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679110 2561 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679113 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679115 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679118 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679120 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679123 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679126 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679128 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679130 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679133 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679135 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679138 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679140 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:28.681291 
ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679143 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679146 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679149 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679152 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679155 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679157 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:28.681291 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679160 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679162 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679165 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679167 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679170 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679172 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679175 2561 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679178 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679180 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679183 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679186 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679190 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679193 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679196 2561 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679199 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679201 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679204 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679206 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:28.681800 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679209 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:28.682251 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:28.679214 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679321 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679327 2561 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679330 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679333 2561 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679336 2561 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679338 2561 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679341 2561 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679343 2561 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679346 2561 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:20:28.682251 
ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679349 2561 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679352 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679355 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679358 2561 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679360 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:20:28.682251 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679363 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679365 2561 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679368 2561 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679371 2561 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679374 2561 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679376 2561 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679379 2561 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679381 2561 feature_gate.go:328] unrecognized feature 
gate: InsightsConfig Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679384 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679386 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679389 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679391 2561 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679394 2561 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679398 2561 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679402 2561 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679405 2561 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679407 2561 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679410 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679413 2561 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679416 2561 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:20:28.682671 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679419 2561 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679421 2561 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679424 2561 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679426 2561 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679429 2561 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679431 2561 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679434 2561 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 
18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679437 2561 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679439 2561 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679449 2561 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679451 2561 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679454 2561 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679457 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679459 2561 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679462 2561 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679465 2561 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679468 2561 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679470 2561 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679473 2561 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679475 2561 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:20:28.683189 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679478 2561 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679480 2561 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679483 2561 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679485 2561 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679488 2561 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679490 2561 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679493 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679495 2561 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679498 2561 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679501 2561 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679503 2561 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679506 2561 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:20:28.683713 ip-10-0-143-95 
kubenswrapper[2561]: W0422 18:20:28.679525 2561 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679528 2561 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679531 2561 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679533 2561 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679536 2561 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679539 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679541 2561 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:20:28.683713 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679544 2561 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679547 2561 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679549 2561 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679552 2561 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679555 2561 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679558 2561 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679560 2561 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679563 2561 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679565 2561 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679568 2561 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679570 2561 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679574 2561 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:28.679578 2561 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.679582 2561 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.680212 2561 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:20:28.684185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.682438 2561 
bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:20:28.684569 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.683238 2561 server.go:1019] "Starting client certificate rotation" Apr 22 18:20:28.684569 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.683327 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:20:28.684569 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.684089 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:20:28.703672 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.703651 2561 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:20:28.709071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.709051 2561 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:20:28.723285 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.723264 2561 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:20:28.728903 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.728879 2561 log.go:25] "Validated CRI v1 image API" Apr 22 18:20:28.729999 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.729979 2561 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:20:28.734592 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.734567 2561 fs.go:135] Filesystem UUIDs: map[27081e25-11b3-46d6-8578-bb699a997f30:/dev/nvme0n1p3 2f9d9342-8c47-4f76-8558-50b0bce4d40f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 18:20:28.734664 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.734592 2561 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 
fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:20:28.740293 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.740151 2561 manager.go:217] Machine: {Timestamp:2026-04-22 18:20:28.738364421 +0000 UTC m=+0.355005559 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3076376 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21a80717540b9b75b1cb069e9fb6e6 SystemUUID:ec21a807-1754-0b9b-75b1-cb069e9fb6e6 BootID:17503638-c682-4e5a-a5ed-1e3854e066f9 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2f:9c:0c:b7:91 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2f:9c:0c:b7:91 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:ff:c6:82:e8:24 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] 
Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:20:28.740293 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.740284 2561 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 22 18:20:28.740429 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.740388 2561 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:20:28.741276 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.741250 2561 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:20:28.741417 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.741279 2561 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-143-95.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:20:28.741460 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.741426 2561 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:20:28.741460 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.741434 2561 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:20:28.741460 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.741447 2561 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:20:28.742087 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.742076 2561 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:20:28.742854 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.742845 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:20:28.742975 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.742966 2561 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:20:28.743542 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.743526 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:28.744936 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.744925 2561 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:20:28.744977 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.744941 2561 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:20:28.744977 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.744954 2561 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:20:28.744977 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.744963 2561 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:20:28.744977 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.744976 2561 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 18:20:28.745848 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.745835 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:20:28.745899 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.745854 2561 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 18:20:28.748200 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.748177 2561 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 18:20:28.749337 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.749324 2561 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 18:20:28.750765 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750754 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750772 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750778 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750784 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750790 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750796 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 18:20:28.750803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750803 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 18:20:28.750959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750809 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 18:20:28.750959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750816 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 18:20:28.750959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750822 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 18:20:28.750959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750831 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 18:20:28.750959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.750840 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 18:20:28.751634 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.751625 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 18:20:28.751634 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.751634 2561 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 18:20:28.755951 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.755931 2561 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 18:20:28.756057 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.755978 2561 server.go:1295] "Started kubelet"
Apr 22 18:20:28.756130 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.756055 2561 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 18:20:28.756472 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.756088 2561 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 18:20:28.756590 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.756524 2561 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 18:20:28.756948 ip-10-0-143-95 systemd[1]: Started Kubernetes Kubelet.
Apr 22 18:20:28.758482 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.758459 2561 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 18:20:28.758683 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.758668 2561 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 18:20:28.759545 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.759499 2561 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-143-95.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 22 18:20:28.759660 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.759562 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-143-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 18:20:28.759660 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.759574 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 18:20:28.762094 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.761329 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-95.ec2.internal.18a8c0ca73c61033 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-95.ec2.internal,UID:ip-10-0-143-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-143-95.ec2.internal,},FirstTimestamp:2026-04-22 18:20:28.755947571 +0000 UTC m=+0.372588692,LastTimestamp:2026-04-22 18:20:28.755947571 +0000 UTC m=+0.372588692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-95.ec2.internal,}"
Apr 22 18:20:28.764032 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.764004 2561 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 18:20:28.764487 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.764474 2561 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 18:20:28.765049 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765022 2561 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 18:20:28.765049 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765029 2561 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 18:20:28.765184 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765058 2561 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 18:20:28.765184 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765165 2561 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 18:20:28.765184 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765176 2561 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 18:20:28.765351 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.765222 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:28.765351 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765276 2561 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 18:20:28.765351 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765290 2561 factory.go:55] Registering systemd factory
Apr 22 18:20:28.765351 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765300 2561 factory.go:223] Registration of the systemd container factory successfully
Apr 22 18:20:28.765579 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765501 2561 factory.go:153] Registering CRI-O factory
Apr 22 18:20:28.765579 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765530 2561 factory.go:223] Registration of the crio container factory successfully
Apr 22 18:20:28.765579 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765555 2561 factory.go:103] Registering Raw factory
Apr 22 18:20:28.765579 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.765569 2561 manager.go:1196] Started watching for new ooms in manager
Apr 22 18:20:28.766200 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.766181 2561 manager.go:319] Starting recovery of all containers
Apr 22 18:20:28.768042 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.768016 2561 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 18:20:28.768152 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.768125 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 18:20:28.768301 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.768265 2561 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-143-95.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 18:20:28.776935 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.776684 2561 manager.go:324] Recovery completed
Apr 22 18:20:28.782017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.782003 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:28.784207 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784186 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:28.784272 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784221 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:28.784272 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784231 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:28.784743 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784727 2561 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 18:20:28.784810 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784744 2561 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 18:20:28.784810 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.784767 2561 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:20:28.787756 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.787744 2561 policy_none.go:49] "None policy: Start"
Apr 22 18:20:28.787813 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.787760 2561 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 18:20:28.787813 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.787770 2561 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 18:20:28.795874 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.795810 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-95.ec2.internal.18a8c0ca75754744 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-95.ec2.internal,UID:ip-10-0-143-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-143-95.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-143-95.ec2.internal,},FirstTimestamp:2026-04-22 18:20:28.784207684 +0000 UTC m=+0.400848803,LastTimestamp:2026-04-22 18:20:28.784207684 +0000 UTC m=+0.400848803,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-95.ec2.internal,}"
Apr 22 18:20:28.803992 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.803974 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lckv2"
Apr 22 18:20:28.806126 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.806029 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-95.ec2.internal.18a8c0ca75758bff default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-95.ec2.internal,UID:ip-10-0-143-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-143-95.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-143-95.ec2.internal,},FirstTimestamp:2026-04-22 18:20:28.784225279 +0000 UTC m=+0.400866397,LastTimestamp:2026-04-22 18:20:28.784225279 +0000 UTC m=+0.400866397,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-95.ec2.internal,}"
Apr 22 18:20:28.813114 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.813092 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lckv2"
Apr 22 18:20:28.814129 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.814056 2561 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-143-95.ec2.internal.18a8c0ca7575b17d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-143-95.ec2.internal,UID:ip-10-0-143-95.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-143-95.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-143-95.ec2.internal,},FirstTimestamp:2026-04-22 18:20:28.784234877 +0000 UTC m=+0.400876000,LastTimestamp:2026-04-22 18:20:28.784234877 +0000 UTC m=+0.400876000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-143-95.ec2.internal,}"
Apr 22 18:20:28.821104 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821091 2561 manager.go:341] "Starting Device Plugin manager"
Apr 22 18:20:28.821161 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.821126 2561 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 18:20:28.821161 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821139 2561 server.go:85] "Starting device plugin registration server"
Apr 22 18:20:28.821469 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821454 2561 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 18:20:28.821567 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821468 2561 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 18:20:28.821667 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821647 2561 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 18:20:28.821724 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821717 2561 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 18:20:28.821775 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.821725 2561 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 18:20:28.822262 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.822235 2561 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 18:20:28.822342 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.822277 2561 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:28.892504 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.892472 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 18:20:28.894670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.893691 2561 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 18:20:28.894670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.893716 2561 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 18:20:28.894670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.893734 2561 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 18:20:28.894670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.893742 2561 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 18:20:28.894670 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.893774 2561 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 18:20:28.896552 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.896532 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:28.922622 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.922559 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:28.924494 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.924476 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:28.924617 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.924528 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:28.924617 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.924543 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:28.924617 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.924572 2561 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-143-95.ec2.internal"
Apr 22 18:20:28.931649 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.931633 2561 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-143-95.ec2.internal"
Apr 22 18:20:28.931710 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.931655 2561 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-143-95.ec2.internal\": node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:28.954322 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:28.954300 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:28.994642 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.994617 2561 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"]
Apr 22 18:20:28.994713 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.994687 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:28.995621 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.995607 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:28.995707 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.995635 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:28.995707 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.995645 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:28.997833 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.997820 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:28.997983 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.997968 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:28.998045 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.998004 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:28.999563 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999548 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:28.999563 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999556 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:28.999699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999579 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:28.999699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999580 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:28.999699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999595 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:28.999699 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:28.999606 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:29.001837 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.001822 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.001901 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.001846 2561 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 18:20:29.002789 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.002775 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 18:20:29.002859 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.002795 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 18:20:29.002859 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.002804 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeHasSufficientPID"
Apr 22 18:20:29.028167 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.028142 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-95.ec2.internal\" not found" node="ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.032408 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.032391 2561 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-143-95.ec2.internal\" not found" node="ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.054817 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.054800 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.067347 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.067318 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03ab3bb00fd17a2fe851fecedb91531c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-95.ec2.internal\" (UID: \"03ab3bb00fd17a2fe851fecedb91531c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.067425 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.067361 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.067425 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.067383 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.155391 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.155364 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.168133 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168111 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03ab3bb00fd17a2fe851fecedb91531c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-95.ec2.internal\" (UID: \"03ab3bb00fd17a2fe851fecedb91531c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.168197 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168138 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.168197 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168159 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.168197 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168192 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.168289 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168202 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d15035323cabd0fb99c0ba1e4edb0b02-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal\" (UID: \"d15035323cabd0fb99c0ba1e4edb0b02\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.168289 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.168226 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/03ab3bb00fd17a2fe851fecedb91531c-config\") pod \"kube-apiserver-proxy-ip-10-0-143-95.ec2.internal\" (UID: \"03ab3bb00fd17a2fe851fecedb91531c\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.256026 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.255958 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.329408 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.329381 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.334018 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.333994 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal"
Apr 22 18:20:29.356501 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.356474 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.457013 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.456980 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.557539 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.557445 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.658162 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.658132 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.683571 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.683552 2561 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 18:20:29.684097 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.683703 2561 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 18:20:29.758699 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.758670 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\" not found"
Apr 22 18:20:29.764428 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.764412 2561 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 18:20:29.768948 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.768929 2561 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 18:20:29.783326 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.783299 2561 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:20:29.816598 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.816534 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:15:28 +0000 UTC" deadline="2028-01-02 22:53:17.986845154 +0000 UTC"
Apr 22 18:20:29.816598 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.816568 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14884h32m48.170280051s"
Apr 22 18:20:29.859804 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:29.859777 2561 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-143-95.ec2.internal\"
not found" Apr 22 18:20:29.876745 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.876718 2561 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:29.888794 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.888767 2561 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n67rv" Apr 22 18:20:29.897123 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.897096 2561 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n67rv" Apr 22 18:20:29.965242 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.965214 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal" Apr 22 18:20:29.976111 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:29.976076 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd15035323cabd0fb99c0ba1e4edb0b02.slice/crio-5de836cd36c4e57120b9c6f9d65b8aad5c68d5fb89f0979ea64fdd033a145dc4 WatchSource:0}: Error finding container 5de836cd36c4e57120b9c6f9d65b8aad5c68d5fb89f0979ea64fdd033a145dc4: Status 404 returned error can't find the container with id 5de836cd36c4e57120b9c6f9d65b8aad5c68d5fb89f0979ea64fdd033a145dc4 Apr 22 18:20:29.980437 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.980358 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:29.981913 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.981897 2561 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal" Apr 22 18:20:29.982107 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:29.982092 2561 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 22 18:20:29.987611 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:29.987590 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03ab3bb00fd17a2fe851fecedb91531c.slice/crio-0abb6259a3b3881a4488d2b14a7f057e352c790b57f96f39223ccba35379194a WatchSource:0}: Error finding container 0abb6259a3b3881a4488d2b14a7f057e352c790b57f96f39223ccba35379194a: Status 404 returned error can't find the container with id 0abb6259a3b3881a4488d2b14a7f057e352c790b57f96f39223ccba35379194a Apr 22 18:20:30.001939 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.001914 2561 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:20:30.231723 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.231691 2561 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:30.746753 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.746582 2561 apiserver.go:52] "Watching apiserver" Apr 22 18:20:30.757988 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.757960 2561 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:20:30.759109 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.759059 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal","openshift-network-diagnostics/network-check-target-sf25h","openshift-network-operator/iptables-alerter-xs7wj","openshift-ovn-kubernetes/ovnkube-node-tp5cv","kube-system/konnectivity-agent-9dcbx","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq","openshift-cluster-node-tuning-operator/tuned-bd65z","openshift-dns/node-resolver-mnvl5","openshift-image-registry/node-ca-wxxx2","openshift-multus/multus-additional-cni-plugins-cbzsc","openshift-multus/multus-lcrhm","openshift-multus/network-metrics-daemon-44prk","kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal"] Apr 22 18:20:30.764131 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.764108 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.766281 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.766255 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:30.766686 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.766457 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:30.768617 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.768597 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.769553 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.769534 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.770166 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.769794 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.770166 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.769887 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:20:30.770166 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.769887 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jts9f\"" Apr 22 18:20:30.773035 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.771074 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.773035 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.771186 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.773035 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.771644 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.773987 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.773553 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.774995 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.774968 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:20:30.775141 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.775102 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:20:30.776558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776533 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-systemd\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776646 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776582 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-sys\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776646 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776608 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-host\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776646 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776629 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-tuned\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776653 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-tmp\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776680 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kjm\" (UniqueName: \"kubernetes.io/projected/c37bfcbd-6bd8-4a75-8c85-a5436d184894-kube-api-access-b8kjm\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776713 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d5a8446b-4f32-4b50-b5eb-2657be43dc10-iptables-alerter-script\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776736 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-kubernetes\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776757 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-run\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776777 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a8446b-4f32-4b50-b5eb-2657be43dc10-host-slash\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776818 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhl4\" (UniqueName: \"kubernetes.io/projected/d5a8446b-4f32-4b50-b5eb-2657be43dc10-kube-api-access-qvhl4\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.776849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776842 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-modprobe-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776863 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-conf\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " 
pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776884 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-lib-modules\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776906 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-var-lib-kubelet\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776954 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.776985 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysconfig\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.777008 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.777811 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.777779 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kqgx2\"" Apr 22 18:20:30.777874 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.777838 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:20:30.777969 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.777787 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:20:30.778041 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.778024 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.778173 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.778156 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:20:30.778233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.778217 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.778488 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.778471 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2w9sz\"" Apr 22 18:20:30.778876 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.778858 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.779038 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779024 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:20:30.779112 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779093 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-l4w5z\"" Apr 22 18:20:30.779564 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779545 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:20:30.779643 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779601 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.779701 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779552 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.779764 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779732 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-245xn\"" Apr 22 18:20:30.779922 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.779905 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.782504 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.782484 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.782695 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.782678 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.783279 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.783258 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.784380 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.784361 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:20:30.784760 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.784704 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.785611 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.785076 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-c4q6z\"" Apr 22 18:20:30.785611 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.785362 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.787036 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.786951 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.787126 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.787037 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.787213 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xbbvw\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.787252 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.787425 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.787533 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.788010 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:20:30.788742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.788060 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-q64zh\"" Apr 22 18:20:30.789483 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.789463 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:30.789653 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.789617 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:30.789950 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.789935 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.795806 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.795782 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-dpjth\"" Apr 22 18:20:30.796330 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.796197 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:20:30.866258 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.866235 2561 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:20:30.877981 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.877948 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-modprobe-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.877988 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-lib-modules\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878019 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-netns\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878045 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovn-node-metrics-cert\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878070 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-hosts-file\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.878111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878098 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-systemd\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878122 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-tuned\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878147 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-node-log\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 
18:20:30.878153 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-modprobe-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878154 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-lib-modules\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878172 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-bin\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878192 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878213 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-env-overrides\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878233 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-socket-dir-parent\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878236 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-systemd\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878249 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptssw\" (UniqueName: \"kubernetes.io/projected/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-kube-api-access-ptssw\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878265 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878354 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-run\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.878405 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878384 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-kubelet\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878422 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-run\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878459 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-socket-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878484 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-os-release\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878527 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-tmp-dir\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878541 2561 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878554 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-conf\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878581 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-etc-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878641 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878663 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/c4067f04-ceb3-492b-a98c-80b9c869cc01-host\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878693 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-multus-certs\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878706 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-conf\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878721 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysconfig\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878746 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-tmp\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878771 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-config\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878796 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9x46\" (UniqueName: \"kubernetes.io/projected/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-kube-api-access-g9x46\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878794 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysconfig\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.879030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878825 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdplz\" (UniqueName: \"kubernetes.io/projected/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kube-api-access-zdplz\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878850 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-cnibin\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 
18:20:30.878874 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-netns\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878896 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-etc-kubernetes\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878925 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d5a8446b-4f32-4b50-b5eb-2657be43dc10-iptables-alerter-script\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878951 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-systemd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878967 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-binary-copy\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " 
pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.878984 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a8446b-4f32-4b50-b5eb-2657be43dc10-host-slash\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879009 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhl4\" (UniqueName: \"kubernetes.io/projected/d5a8446b-4f32-4b50-b5eb-2657be43dc10-kube-api-access-qvhl4\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879044 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879075 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879102 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879130 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-cnibin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879164 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-hostroot\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879198 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-conf-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879221 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4067f04-ceb3-492b-a98c-80b9c869cc01-serviceca\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.879826 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879247 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-system-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879279 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-k8s-cni-cncf-io\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879305 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5ae596f0-7dd8-44a5-bece-be47e95773a0-agent-certs\") pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879335 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-host\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879363 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-registration-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.880558 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:30.879392 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqk4r\" (UniqueName: \"kubernetes.io/projected/f617d906-31ed-45b2-ad64-99d0315fed58-kube-api-access-sqk4r\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879417 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-daemon-config\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879443 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kjm\" (UniqueName: \"kubernetes.io/projected/c37bfcbd-6bd8-4a75-8c85-a5436d184894-kube-api-access-b8kjm\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879474 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879499 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-device-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" 
(UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879541 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbf7l\" (UniqueName: \"kubernetes.io/projected/9981a1fc-cec1-4187-82c7-f8a291c71356-kube-api-access-zbf7l\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879574 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-kubernetes\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879598 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-log-socket\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879624 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879639 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d5a8446b-4f32-4b50-b5eb-2657be43dc10-iptables-alerter-script\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879649 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-sys-fs\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.880558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879687 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-system-cni-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879714 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-var-lib-kubelet\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879844 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d5a8446b-4f32-4b50-b5eb-2657be43dc10-host-slash\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.879858 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-kubernetes\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880010 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-var-lib-kubelet\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880058 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-systemd-units\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880078 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-host\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880097 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-slash\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880134 2561 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-var-lib-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880193 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-os-release\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880229 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880257 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880304 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " 
pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880330 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880441 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-sys\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880485 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-ovn\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880527 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-netd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.881238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880554 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-cni-binary-copy\") pod \"multus-lcrhm\" (UID: 
\"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880559 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-sys\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880560 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-sysctl-d\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880596 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-bin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880636 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-multus\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880691 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v6fh\" (UniqueName: \"kubernetes.io/projected/c4067f04-ceb3-492b-a98c-80b9c869cc01-kube-api-access-7v6fh\") pod 
\"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880744 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-kubelet\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880781 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rb5t\" (UniqueName: \"kubernetes.io/projected/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-kube-api-access-7rb5t\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880819 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-script-lib\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.882182 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.880848 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ae596f0-7dd8-44a5-bece-be47e95773a0-konnectivity-ca\") pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.882631 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.882375 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-etc-tuned\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.884086 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.884059 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c37bfcbd-6bd8-4a75-8c85-a5436d184894-tmp\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.888170 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.888146 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:30.888339 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.888178 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:30.888339 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.888192 2561 projected.go:194] Error preparing data for projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:30.888339 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.888259 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:31.388238247 +0000 UTC m=+3.004879357 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:30.891637 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.891457 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kjm\" (UniqueName: \"kubernetes.io/projected/c37bfcbd-6bd8-4a75-8c85-a5436d184894-kube-api-access-b8kjm\") pod \"tuned-bd65z\" (UID: \"c37bfcbd-6bd8-4a75-8c85-a5436d184894\") " pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:30.891637 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.891590 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhl4\" (UniqueName: \"kubernetes.io/projected/d5a8446b-4f32-4b50-b5eb-2657be43dc10-kube-api-access-qvhl4\") pod \"iptables-alerter-xs7wj\" (UID: \"d5a8446b-4f32-4b50-b5eb-2657be43dc10\") " pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:30.897791 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.897761 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:29 +0000 UTC" deadline="2028-01-03 03:02:51.322800926 +0000 UTC" Apr 22 18:20:30.897791 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.897790 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14888h42m20.425014138s" Apr 22 18:20:30.899358 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.899308 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal" 
event={"ID":"03ab3bb00fd17a2fe851fecedb91531c","Type":"ContainerStarted","Data":"0abb6259a3b3881a4488d2b14a7f057e352c790b57f96f39223ccba35379194a"} Apr 22 18:20:30.900866 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.900845 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal" event={"ID":"d15035323cabd0fb99c0ba1e4edb0b02","Type":"ContainerStarted","Data":"5de836cd36c4e57120b9c6f9d65b8aad5c68d5fb89f0979ea64fdd033a145dc4"} Apr 22 18:20:30.981463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981362 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.981463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981407 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4067f04-ceb3-492b-a98c-80b9c869cc01-host\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.981463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981435 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-multus-certs\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981440 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981474 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-config\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981490 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c4067f04-ceb3-492b-a98c-80b9c869cc01-host\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981499 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9x46\" (UniqueName: \"kubernetes.io/projected/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-kube-api-access-g9x46\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981524 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-multus-certs\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981554 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdplz\" (UniqueName: 
\"kubernetes.io/projected/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kube-api-access-zdplz\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981582 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-cnibin\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981597 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-netns\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981620 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-etc-kubernetes\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981642 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-systemd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981654 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-cnibin\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981662 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-netns\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981678 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-etc-kubernetes\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981705 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-systemd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981666 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-binary-copy\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981739 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981755 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.981803 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981776 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981879 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-cnibin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981917 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-hostroot\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981931 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-cnibin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981941 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-conf-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981863 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-etc-selinux\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981968 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4067f04-ceb3-492b-a98c-80b9c869cc01-serviceca\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981985 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-hostroot\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.981997 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-system-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982020 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-k8s-cni-cncf-io\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982022 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-conf-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982087 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-system-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982126 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-run-k8s-cni-cncf-io\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982154 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5ae596f0-7dd8-44a5-bece-be47e95773a0-agent-certs\") 
pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982214 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-binary-copy\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982182 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-registration-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982220 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-config\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982259 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqk4r\" (UniqueName: \"kubernetes.io/projected/f617d906-31ed-45b2-ad64-99d0315fed58-kube-api-access-sqk4r\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.982596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982313 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-registration-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982364 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-daemon-config\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982410 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982435 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-device-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982460 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbf7l\" (UniqueName: \"kubernetes.io/projected/9981a1fc-cec1-4187-82c7-f8a291c71356-kube-api-access-zbf7l\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982502 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-log-socket\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982550 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/c4067f04-ceb3-492b-a98c-80b9c869cc01-serviceca\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982553 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982588 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-ovn-kubernetes\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982622 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-sys-fs\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982656 2561 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-system-cni-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982681 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-device-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982692 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-systemd-units\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982728 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-slash\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982768 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-var-lib-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.983398 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:30.982794 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-os-release\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982821 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.983398 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982852 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982913 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-ovn\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982940 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-netd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984195 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:30.982971 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-cni-binary-copy\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982987 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-bin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983001 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-multus\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983025 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7v6fh\" (UniqueName: \"kubernetes.io/projected/c4067f04-ceb3-492b-a98c-80b9c869cc01-kube-api-access-7v6fh\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983049 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-kubelet\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: 
I0422 18:20:30.983056 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983063 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f617d906-31ed-45b2-ad64-99d0315fed58-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.982627 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983065 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rb5t\" (UniqueName: \"kubernetes.io/projected/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-kube-api-access-7rb5t\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983121 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-system-cni-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " 
pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983207 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-os-release\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983246 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-netd\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983314 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-cni-dir\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983374 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-script-lib\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.983381 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:30.984195 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983464 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-log-socket\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983494 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ae596f0-7dd8-44a5-bece-be47e95773a0-konnectivity-ca\") pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983525 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-kubelet\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983536 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-var-lib-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983563 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-run-ovn\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983578 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-netns\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983540 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-run-netns\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983640 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-cni-binary-copy\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983660 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovn-node-metrics-cert\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983694 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-hosts-file\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983721 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" 
(UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-node-log\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983762 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-bin\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983819 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-node-log\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983822 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-slash\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983850 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-multus\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983898 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-cni-bin\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983903 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-daemon-config\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.984952 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:30.983944 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:31.48390859 +0000 UTC m=+3.100549715 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983952 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-systemd-units\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.983968 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-var-lib-cni-networks-ovn-kubernetes\") 
pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984011 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-env-overrides\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984070 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-hosts-file\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984117 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984126 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/5ae596f0-7dd8-44a5-bece-be47e95773a0-konnectivity-ca\") pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984177 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-socket-dir-parent\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984182 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovnkube-script-lib\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984239 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-sys-fs\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984240 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-host-var-lib-cni-bin\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984288 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptssw\" (UniqueName: \"kubernetes.io/projected/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-kube-api-access-ptssw\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984297 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-multus-socket-dir-parent\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.984362 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.985172 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-env-overrides\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.985439 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-kubelet\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.985684 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-socket-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.985785 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.985724 2561 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-os-release\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.986544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.985908 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/5ae596f0-7dd8-44a5-bece-be47e95773a0-agent-certs\") pod \"konnectivity-agent-9dcbx\" (UID: \"5ae596f0-7dd8-44a5-bece-be47e95773a0\") " pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:30.986544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986123 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-tmp-dir\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.986544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986160 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-etc-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.986544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986183 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f617d906-31ed-45b2-ad64-99d0315fed58-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.986544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986426 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-ovn-node-metrics-cert\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.986757 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986586 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-host-kubelet\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.986757 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986683 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9981a1fc-cec1-4187-82c7-f8a291c71356-os-release\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.986846 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.986779 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-socket-dir\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.987068 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.987049 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-tmp-dir\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.987115 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.987103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-etc-openvswitch\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.991831 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.991690 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9x46\" (UniqueName: \"kubernetes.io/projected/d74da43e-5f0d-4fd5-94fc-934129e8ccc0-kube-api-access-g9x46\") pod \"ovnkube-node-tp5cv\" (UID: \"d74da43e-5f0d-4fd5-94fc-934129e8ccc0\") " pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:30.994061 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.994034 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdplz\" (UniqueName: \"kubernetes.io/projected/dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57-kube-api-access-zdplz\") pod \"aws-ebs-csi-driver-node-s2ddq\" (UID: \"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:30.994167 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.994119 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqk4r\" (UniqueName: \"kubernetes.io/projected/f617d906-31ed-45b2-ad64-99d0315fed58-kube-api-access-sqk4r\") pod \"multus-additional-cni-plugins-cbzsc\" (UID: \"f617d906-31ed-45b2-ad64-99d0315fed58\") " pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:30.994430 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.994406 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v6fh\" (UniqueName: \"kubernetes.io/projected/c4067f04-ceb3-492b-a98c-80b9c869cc01-kube-api-access-7v6fh\") pod \"node-ca-wxxx2\" (UID: \"c4067f04-ceb3-492b-a98c-80b9c869cc01\") " pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:30.994866 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.994837 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zbf7l\" (UniqueName: \"kubernetes.io/projected/9981a1fc-cec1-4187-82c7-f8a291c71356-kube-api-access-zbf7l\") pod \"multus-lcrhm\" (UID: \"9981a1fc-cec1-4187-82c7-f8a291c71356\") " pod="openshift-multus/multus-lcrhm" Apr 22 18:20:30.995853 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.995832 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rb5t\" (UniqueName: \"kubernetes.io/projected/543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291-kube-api-access-7rb5t\") pod \"node-resolver-mnvl5\" (UID: \"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291\") " pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:30.998246 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:30.998184 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptssw\" (UniqueName: \"kubernetes.io/projected/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-kube-api-access-ptssw\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:31.029629 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.029598 2561 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:20:31.079249 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.079216 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xs7wj" Apr 22 18:20:31.093120 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.093094 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:31.104721 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.104694 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bd65z" Apr 22 18:20:31.109326 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.109307 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:31.118979 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.118957 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" Apr 22 18:20:31.126594 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.126573 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-wxxx2" Apr 22 18:20:31.142225 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.142197 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mnvl5" Apr 22 18:20:31.148868 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.148844 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" Apr 22 18:20:31.155492 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.155471 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lcrhm" Apr 22 18:20:31.388403 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.388333 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:31.388564 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.388449 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:31.388564 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.388465 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:31.388564 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.388476 2561 projected.go:194] Error preparing data for projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:31.388564 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.388547 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:32.388531653 +0000 UTC m=+4.005172763 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:31.489202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.489165 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:31.489373 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.489292 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:31.489373 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:31.489355 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:32.489336324 +0000 UTC m=+4.105977443 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:31.828373 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.828332 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ae596f0_7dd8_44a5_bece_be47e95773a0.slice/crio-a06a84c5235534f6d694a718b7d0bdb341f46637b3631f37c48bf3ebe40e0c90 WatchSource:0}: Error finding container a06a84c5235534f6d694a718b7d0bdb341f46637b3631f37c48bf3ebe40e0c90: Status 404 returned error can't find the container with id a06a84c5235534f6d694a718b7d0bdb341f46637b3631f37c48bf3ebe40e0c90 Apr 22 18:20:31.829743 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.829722 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4067f04_ceb3_492b_a98c_80b9c869cc01.slice/crio-97563a16a1e4b66c5b2375e860569aad0cacd624a6ee2acebd1cb649638bf169 WatchSource:0}: Error finding container 97563a16a1e4b66c5b2375e860569aad0cacd624a6ee2acebd1cb649638bf169: Status 404 returned error can't find the container with id 97563a16a1e4b66c5b2375e860569aad0cacd624a6ee2acebd1cb649638bf169 Apr 22 18:20:31.830462 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.830432 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcf7e1d6_c599_4b7d_ba0d_91c9cd65fa57.slice/crio-7a94b8ef7fb27a8d2dede79410728fd797f8381d58753560d80d477da41603b6 WatchSource:0}: Error finding container 7a94b8ef7fb27a8d2dede79410728fd797f8381d58753560d80d477da41603b6: Status 404 returned error can't find the container with id 7a94b8ef7fb27a8d2dede79410728fd797f8381d58753560d80d477da41603b6 Apr 22 18:20:31.835038 
ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.834996 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod543e99a2_8eb7_4ffb_b5ef_3e4ee83a0291.slice/crio-83225a38b23676f446801d7fec0391c6162b0cf64562238e0e10b96e1b1f4996 WatchSource:0}: Error finding container 83225a38b23676f446801d7fec0391c6162b0cf64562238e0e10b96e1b1f4996: Status 404 returned error can't find the container with id 83225a38b23676f446801d7fec0391c6162b0cf64562238e0e10b96e1b1f4996 Apr 22 18:20:31.835722 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.835665 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc37bfcbd_6bd8_4a75_8c85_a5436d184894.slice/crio-cda37a11adb44829ce53825907c2a83c0770fcd18b890af5e0902468b392714e WatchSource:0}: Error finding container cda37a11adb44829ce53825907c2a83c0770fcd18b890af5e0902468b392714e: Status 404 returned error can't find the container with id cda37a11adb44829ce53825907c2a83c0770fcd18b890af5e0902468b392714e Apr 22 18:20:31.842397 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.842376 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf617d906_31ed_45b2_ad64_99d0315fed58.slice/crio-a331c8e745cdba7276a8656e4ff0ca746dcb332db57d7ff17a95e3f2841c0ea3 WatchSource:0}: Error finding container a331c8e745cdba7276a8656e4ff0ca746dcb332db57d7ff17a95e3f2841c0ea3: Status 404 returned error can't find the container with id a331c8e745cdba7276a8656e4ff0ca746dcb332db57d7ff17a95e3f2841c0ea3 Apr 22 18:20:31.864658 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.864629 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9981a1fc_cec1_4187_82c7_f8a291c71356.slice/crio-eb0a811e8c1715819f6892f3a615609d12817898fb92b9ef1cf8ccf1170ea24a WatchSource:0}: Error 
finding container eb0a811e8c1715819f6892f3a615609d12817898fb92b9ef1cf8ccf1170ea24a: Status 404 returned error can't find the container with id eb0a811e8c1715819f6892f3a615609d12817898fb92b9ef1cf8ccf1170ea24a Apr 22 18:20:31.865474 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.865451 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5a8446b_4f32_4b50_b5eb_2657be43dc10.slice/crio-ab5bad6dc6480326604456e15c0dea28d47d0b1765f683060df53cfcb28ecf0d WatchSource:0}: Error finding container ab5bad6dc6480326604456e15c0dea28d47d0b1765f683060df53cfcb28ecf0d: Status 404 returned error can't find the container with id ab5bad6dc6480326604456e15c0dea28d47d0b1765f683060df53cfcb28ecf0d Apr 22 18:20:31.866753 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:20:31.866663 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd74da43e_5f0d_4fd5_94fc_934129e8ccc0.slice/crio-bdfd466b161db0c05a88f23687cd2e951702cb7b1e014518d81cf52da307494a WatchSource:0}: Error finding container bdfd466b161db0c05a88f23687cd2e951702cb7b1e014518d81cf52da307494a: Status 404 returned error can't find the container with id bdfd466b161db0c05a88f23687cd2e951702cb7b1e014518d81cf52da307494a Apr 22 18:20:31.898464 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.898430 2561 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:15:29 +0000 UTC" deadline="2027-10-16 09:15:50.198845061 +0000 UTC" Apr 22 18:20:31.898573 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.898464 2561 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12998h55m18.300384776s" Apr 22 18:20:31.903235 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.903207 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerStarted","Data":"a331c8e745cdba7276a8656e4ff0ca746dcb332db57d7ff17a95e3f2841c0ea3"} Apr 22 18:20:31.905949 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.905925 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" event={"ID":"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57","Type":"ContainerStarted","Data":"7a94b8ef7fb27a8d2dede79410728fd797f8381d58753560d80d477da41603b6"} Apr 22 18:20:31.909646 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.909624 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"bdfd466b161db0c05a88f23687cd2e951702cb7b1e014518d81cf52da307494a"} Apr 22 18:20:31.912256 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.912236 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xs7wj" event={"ID":"d5a8446b-4f32-4b50-b5eb-2657be43dc10","Type":"ContainerStarted","Data":"ab5bad6dc6480326604456e15c0dea28d47d0b1765f683060df53cfcb28ecf0d"} Apr 22 18:20:31.913223 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.913203 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mnvl5" event={"ID":"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291","Type":"ContainerStarted","Data":"83225a38b23676f446801d7fec0391c6162b0cf64562238e0e10b96e1b1f4996"} Apr 22 18:20:31.914097 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.914079 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxxx2" event={"ID":"c4067f04-ceb3-492b-a98c-80b9c869cc01","Type":"ContainerStarted","Data":"97563a16a1e4b66c5b2375e860569aad0cacd624a6ee2acebd1cb649638bf169"} Apr 22 18:20:31.915002 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.914977 2561 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9dcbx" event={"ID":"5ae596f0-7dd8-44a5-bece-be47e95773a0","Type":"ContainerStarted","Data":"a06a84c5235534f6d694a718b7d0bdb341f46637b3631f37c48bf3ebe40e0c90"} Apr 22 18:20:31.915900 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.915880 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bd65z" event={"ID":"c37bfcbd-6bd8-4a75-8c85-a5436d184894","Type":"ContainerStarted","Data":"cda37a11adb44829ce53825907c2a83c0770fcd18b890af5e0902468b392714e"} Apr 22 18:20:31.916696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:31.916673 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lcrhm" event={"ID":"9981a1fc-cec1-4187-82c7-f8a291c71356","Type":"ContainerStarted","Data":"eb0a811e8c1715819f6892f3a615609d12817898fb92b9ef1cf8ccf1170ea24a"} Apr 22 18:20:32.394981 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.394913 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:32.395107 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.395029 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:32.395107 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.395052 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:32.395107 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.395065 2561 projected.go:194] Error preparing data for 
projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:32.395256 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.395125 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:34.395105348 +0000 UTC m=+6.011746469 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:32.496911 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.496276 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:32.496911 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.496477 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:32.496911 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.496562 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:34.49654177 +0000 UTC m=+6.113182897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:32.894734 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.894698 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:32.895180 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.894848 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:32.897607 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.897582 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:32.897742 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:32.897687 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:32.920653 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.920618 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal" event={"ID":"03ab3bb00fd17a2fe851fecedb91531c","Type":"ContainerStarted","Data":"5aad062785515f930edbd5aa8d51728ebafdca2a4cadae805c77d316cbd73cc6"} Apr 22 18:20:32.927557 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.927052 2561 generic.go:358] "Generic (PLEG): container finished" podID="d15035323cabd0fb99c0ba1e4edb0b02" containerID="9b38fdf954c2816e5fd206fe6dd4799c1e8268c84323e0ff85d73cdf1f731f3a" exitCode=0 Apr 22 18:20:32.927557 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.927107 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal" event={"ID":"d15035323cabd0fb99c0ba1e4edb0b02","Type":"ContainerDied","Data":"9b38fdf954c2816e5fd206fe6dd4799c1e8268c84323e0ff85d73cdf1f731f3a"} Apr 22 18:20:32.937921 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:32.937872 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-143-95.ec2.internal" podStartSLOduration=3.93785519 podStartE2EDuration="3.93785519s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:32.937275461 +0000 UTC m=+4.553916612" watchObservedRunningTime="2026-04-22 18:20:32.93785519 +0000 UTC m=+4.554496319" Apr 22 18:20:33.949603 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:33.949564 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal" 
event={"ID":"d15035323cabd0fb99c0ba1e4edb0b02","Type":"ContainerStarted","Data":"f9dce0e3ca7e35675ac9f68d357f3e1f6bbbc82c8a6a33ab2ac0ffc983c07375"} Apr 22 18:20:34.411797 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:34.411757 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:34.412029 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.411977 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:34.412029 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.412001 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:34.412029 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.412013 2561 projected.go:194] Error preparing data for projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:34.412207 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.412078 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:38.412058623 +0000 UTC m=+10.028699764 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:34.512267 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:34.512228 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:34.512450 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.512390 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:34.512536 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.512455 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:38.512437138 +0000 UTC m=+10.129078262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:34.895091 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:34.895006 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:34.895276 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.895145 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:34.895943 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:34.895916 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:34.896072 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:34.896050 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:36.894608 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:36.894096 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:36.894608 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:36.894229 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:36.894608 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:36.894602 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:36.895150 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:36.894711 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:38.445821 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:38.445777 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:38.446353 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.446302 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:38.446353 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.446345 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:38.446507 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.446360 2561 projected.go:194] Error preparing data for projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:38.446604 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.446531 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:46.446489106 +0000 UTC m=+18.063130214 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:38.547228 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:38.547115 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:38.547405 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.547279 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:38.547405 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.547360 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:46.547339744 +0000 UTC m=+18.163980854 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:38.895863 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:38.895144 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:38.895863 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.895251 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:38.895863 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:38.895666 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:38.895863 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:38.895767 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:40.894948 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:40.894858 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:40.894948 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:40.894884 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:40.895444 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:40.894991 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:40.895444 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:40.895429 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:42.894653 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:42.894616 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:42.895133 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:42.894625 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:42.895133 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:42.894733 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:42.895133 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:42.894841 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:43.972506 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:43.972457 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-143-95.ec2.internal" podStartSLOduration=14.972443538 podStartE2EDuration="14.972443538s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:20:33.96485156 +0000 UTC m=+5.581492689" watchObservedRunningTime="2026-04-22 18:20:43.972443538 +0000 UTC m=+15.589084683" Apr 22 18:20:43.973141 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:43.973114 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-crcs7"] Apr 22 18:20:43.976128 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:43.976104 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:43.976223 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:43.976188 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:44.088811 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.088776 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-kubelet-config\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.088956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.088823 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.088956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.088846 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-dbus\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.190082 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.190039 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.190232 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.190142 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-dbus\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.190232 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.190205 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.190232 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.190222 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-dbus\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.190386 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.190249 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-kubelet-config\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.190386 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.190264 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:20:44.69024744 +0000 UTC m=+16.306888564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.190386 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.190307 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3c1d8261-0db3-4d2b-808a-e6bfde776154-kubelet-config\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.695192 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.695151 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:44.695417 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.695331 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.695417 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.695416 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:45.695394063 +0000 UTC m=+17.312035186 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:44.894173 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.894140 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:44.894319 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.894261 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:44.894481 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:44.894455 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:44.894630 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:44.894602 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:45.701296 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:45.701256 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:45.701783 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:45.701406 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:45.701783 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:45.701473 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:47.701457686 +0000 UTC m=+19.318098791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:45.894544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:45.894488 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:45.894729 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:45.894628 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:46.508238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:46.508207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:46.508427 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.508370 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:20:46.508427 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.508397 2561 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:20:46.508427 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.508409 2561 projected.go:194] Error preparing data for projected volume kube-api-access-pb99p for pod openshift-network-diagnostics/network-check-target-sf25h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:46.508602 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.508467 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p podName:8ad229f6-99cd-4eac-9f27-b8ae51b8bde3 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.50845188 +0000 UTC m=+34.125092986 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pb99p" (UniqueName: "kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p") pod "network-check-target-sf25h" (UID: "8ad229f6-99cd-4eac-9f27-b8ae51b8bde3") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:20:46.609286 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:46.609256 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:46.609450 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.609416 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:46.609492 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.609475 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.609460677 +0000 UTC m=+34.226101782 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:20:46.894170 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:46.894083 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:46.894665 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.894227 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:46.894665 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:46.894237 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:46.894665 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:46.894344 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:47.716357 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:47.716311 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:47.716550 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:47.716469 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:47.716618 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:47.716555 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:51.716533898 +0000 UTC m=+23.333175023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:47.894177 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:47.894140 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:47.894671 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:47.894275 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:48.895453 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.895215 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:48.896124 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.895263 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:48.896124 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:48.895576 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:48.896124 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:48.895607 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:48.979713 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.979684 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" containerID="63e74bece34ec9feae5100e99ad5fadd3803695b449581370bb69ce778f4585f" exitCode=0 Apr 22 18:20:48.979820 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.979765 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"63e74bece34ec9feae5100e99ad5fadd3803695b449581370bb69ce778f4585f"} Apr 22 18:20:48.981112 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.981056 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" event={"ID":"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57","Type":"ContainerStarted","Data":"125ff2a351887267a8f11204c4affc4baa4e8cf6d853016ce025ee428a3d89f3"} Apr 22 18:20:48.982707 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.982691 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log" Apr 22 18:20:48.982987 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.982969 2561 generic.go:358] "Generic (PLEG): container finished" podID="d74da43e-5f0d-4fd5-94fc-934129e8ccc0" containerID="d5b95ae44cb68d02710f0a6cd44da77339a7f9eec2c2f6827656d500eda9e7c1" exitCode=1 Apr 22 18:20:48.983047 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.983027 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"4e1ea21e4a72aa4858e05d0601824946f5cae156ec9fd4be22f996c79ecfedd9"} Apr 22 18:20:48.983047 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:48.983043 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerDied","Data":"d5b95ae44cb68d02710f0a6cd44da77339a7f9eec2c2f6827656d500eda9e7c1"} Apr 22 18:20:48.983122 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.983053 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"7a61d8b42aaec6bd8231d414dcd18d27d386df71901f48dc24cbf35381c6deed"} Apr 22 18:20:48.984181 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.984156 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mnvl5" event={"ID":"543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291","Type":"ContainerStarted","Data":"8a2d2bb55323f2d9d4b3c059e583c4438ddd2931499fde17e1e7388e6be68f3b"} Apr 22 18:20:48.985250 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.985229 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-wxxx2" event={"ID":"c4067f04-ceb3-492b-a98c-80b9c869cc01","Type":"ContainerStarted","Data":"937c3f6d839d0aa207cfddb7973e8505a6acc66ea8ca510f6e85edc408d8f528"} Apr 22 18:20:48.986469 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.986450 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9dcbx" event={"ID":"5ae596f0-7dd8-44a5-bece-be47e95773a0","Type":"ContainerStarted","Data":"2db8191c97382b01bc8938aa404b881fa51ceb5217e4393f96691a523bd90c9c"} Apr 22 18:20:48.987820 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.987800 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bd65z" event={"ID":"c37bfcbd-6bd8-4a75-8c85-a5436d184894","Type":"ContainerStarted","Data":"c28e6ee97061a6c543a97e163ad35dbe539ba3062e2f9187653aa3294e0c2a71"} Apr 22 18:20:48.989039 
ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:48.989005 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lcrhm" event={"ID":"9981a1fc-cec1-4187-82c7-f8a291c71356","Type":"ContainerStarted","Data":"2d1a63564b185fff0c651ff43c8c68288467518c3d92961aa2df2d93db863dc2"} Apr 22 18:20:49.018026 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.017983 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bd65z" podStartSLOduration=3.406594576 podStartE2EDuration="20.017970463s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.837507354 +0000 UTC m=+3.454148460" lastFinishedPulling="2026-04-22 18:20:48.448883234 +0000 UTC m=+20.065524347" observedRunningTime="2026-04-22 18:20:49.017744947 +0000 UTC m=+20.634386074" watchObservedRunningTime="2026-04-22 18:20:49.017970463 +0000 UTC m=+20.634611591" Apr 22 18:20:49.046542 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.046488 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-wxxx2" podStartSLOduration=11.187149594 podStartE2EDuration="20.046475458s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.833998116 +0000 UTC m=+3.450639238" lastFinishedPulling="2026-04-22 18:20:40.693323985 +0000 UTC m=+12.309965102" observedRunningTime="2026-04-22 18:20:49.031113735 +0000 UTC m=+20.647754865" watchObservedRunningTime="2026-04-22 18:20:49.046475458 +0000 UTC m=+20.663116585" Apr 22 18:20:49.046956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.046933 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9dcbx" podStartSLOduration=3.429030788 podStartE2EDuration="20.046928035s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.830959366 +0000 UTC m=+3.447600473" 
lastFinishedPulling="2026-04-22 18:20:48.448856614 +0000 UTC m=+20.065497720" observedRunningTime="2026-04-22 18:20:49.046847887 +0000 UTC m=+20.663489014" watchObservedRunningTime="2026-04-22 18:20:49.046928035 +0000 UTC m=+20.663569162" Apr 22 18:20:49.107125 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.107080 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mnvl5" podStartSLOduration=3.495299429 podStartE2EDuration="20.107065346s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.837065828 +0000 UTC m=+3.453706935" lastFinishedPulling="2026-04-22 18:20:48.448831732 +0000 UTC m=+20.065472852" observedRunningTime="2026-04-22 18:20:49.070035043 +0000 UTC m=+20.686676184" watchObservedRunningTime="2026-04-22 18:20:49.107065346 +0000 UTC m=+20.723706473" Apr 22 18:20:49.107307 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.107284 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lcrhm" podStartSLOduration=3.390850108 podStartE2EDuration="20.10727734s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.866268174 +0000 UTC m=+3.482909292" lastFinishedPulling="2026-04-22 18:20:48.582695409 +0000 UTC m=+20.199336524" observedRunningTime="2026-04-22 18:20:49.106647814 +0000 UTC m=+20.723288942" watchObservedRunningTime="2026-04-22 18:20:49.10727734 +0000 UTC m=+20.723918467" Apr 22 18:20:49.894897 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.894867 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:49.895018 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:49.894998 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:49.992775 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.992749 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log" Apr 22 18:20:49.993303 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.993100 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"9455ceabea5f4429b2365e2002402b16e7911e3fc4c10e703b670c56c60124f5"} Apr 22 18:20:49.993303 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.993127 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"3fcaf6c38d4f993cd1a2d4ad1c59a371a6b59b6455a8729c56f63c8b6e50fb51"} Apr 22 18:20:49.993303 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.993138 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"2e5b2bf4a845cb534a484ed10d7653b7945659152cea0df305ef3e5cb50efa06"} Apr 22 18:20:49.994378 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:49.994352 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-xs7wj" event={"ID":"d5a8446b-4f32-4b50-b5eb-2657be43dc10","Type":"ContainerStarted","Data":"6bf9127acee9f9152a7748ca0b161743ae8eec3627378c5ab172857a522dde7f"} Apr 22 18:20:50.014851 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.014802 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xs7wj" podStartSLOduration=5.434959104 podStartE2EDuration="22.014785417s" podCreationTimestamp="2026-04-22 18:20:28 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.869004039 +0000 UTC m=+3.485645145" lastFinishedPulling="2026-04-22 18:20:48.448830341 +0000 UTC m=+20.065471458" observedRunningTime="2026-04-22 18:20:50.014237703 +0000 UTC m=+21.630878836" watchObservedRunningTime="2026-04-22 18:20:50.014785417 +0000 UTC m=+21.631426545" Apr 22 18:20:50.207782 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.207751 2561 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:20:50.831488 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.831395 2561 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:20:50.207776674Z","UUID":"6661137f-3911-492e-a2df-c8d4286f9442","Handler":null,"Name":"","Endpoint":""} Apr 22 18:20:50.833375 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.833349 2561 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:20:50.833375 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.833381 2561 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:20:50.894851 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:20:50.894823 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:50.895012 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:50.894926 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:50.895058 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.895014 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:50.895135 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:50.895114 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:50.997782 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:50.997747 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" event={"ID":"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57","Type":"ContainerStarted","Data":"33fb79ccefbd085362ee8a2934365acb327efd131e4fb7e4d21f91acbe580621"} Apr 22 18:20:51.744615 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:51.744397 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:51.744792 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:51.744566 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:51.744792 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:51.744748 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. No retries permitted until 2026-04-22 18:20:59.744727956 +0000 UTC m=+31.361369069 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:51.894636 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:51.894603 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:51.894807 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:51.894722 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:52.003076 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:52.003002 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log" Apr 22 18:20:52.003443 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:52.003356 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"90c48d5883dc36b2f9acf19af28035acbc49cefd8cc8fb511db9a43fae5c8be1"} Apr 22 18:20:52.894228 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:52.894188 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:52.894412 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:52.894188 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:52.894412 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:52.894347 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:52.894412 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:52.894367 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:53.231105 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:53.231026 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:53.231731 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:53.231710 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:53.415944 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:53.415912 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:53.416553 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:53.416530 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9dcbx" Apr 22 18:20:53.895093 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:53.894787 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:53.895222 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:53.895122 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:54.008736 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.008704 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" containerID="c912e236a285ea9b19790bfe1b4eb64cb6a6ddf2474a80081bdcb4c5cfba5039" exitCode=0 Apr 22 18:20:54.008878 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.008792 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"c912e236a285ea9b19790bfe1b4eb64cb6a6ddf2474a80081bdcb4c5cfba5039"} Apr 22 18:20:54.010755 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.010733 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" event={"ID":"dcf7e1d6-c599-4b7d-ba0d-91c9cd65fa57","Type":"ContainerStarted","Data":"b665aacc569d0a1e8653295b81b1fe0ab733848e11b23f93b280030dc7a58675"} Apr 22 18:20:54.013588 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.013570 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log" Apr 22 18:20:54.013907 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.013889 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"984123b6e395e2c2b3674ab5b7d9d0d9dcb557db1212c37cf34664f23a8a3091"} Apr 22 18:20:54.014159 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.014140 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:54.014218 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.014169 2561 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:54.014299 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.014280 2561 scope.go:117] "RemoveContainer" containerID="d5b95ae44cb68d02710f0a6cd44da77339a7f9eec2c2f6827656d500eda9e7c1" Apr 22 18:20:54.029943 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.029924 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:54.030044 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.030009 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:54.894655 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.894618 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:54.894655 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:54.894643 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:54.895256 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:54.894763 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:54.895256 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:54.894893 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:55.017372 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.017338 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" containerID="3612c53d2642f4b0ec554b32917c6d36ab8548b9d0a75edcae841fd63ad88672" exitCode=0 Apr 22 18:20:55.017558 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.017425 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"3612c53d2642f4b0ec554b32917c6d36ab8548b9d0a75edcae841fd63ad88672"} Apr 22 18:20:55.020758 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.020743 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log" Apr 22 18:20:55.021114 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.021089 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" event={"ID":"d74da43e-5f0d-4fd5-94fc-934129e8ccc0","Type":"ContainerStarted","Data":"ae108ebcfc725350eab9dc981d260c8426687d4bc38d707facf4098a7da5c8ce"} Apr 22 18:20:55.021190 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.021181 2561 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:20:55.040837 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.040801 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s2ddq" podStartSLOduration=4.241830022 podStartE2EDuration="26.040789437s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.836149565 +0000 UTC m=+3.452790686" lastFinishedPulling="2026-04-22 18:20:53.63510899 +0000 UTC m=+25.251750101" observedRunningTime="2026-04-22 
18:20:54.088931179 +0000 UTC m=+25.705572309" watchObservedRunningTime="2026-04-22 18:20:55.040789437 +0000 UTC m=+26.657430565" Apr 22 18:20:55.066417 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.066371 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" podStartSLOduration=9.316120739 podStartE2EDuration="26.066358986s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.868869877 +0000 UTC m=+3.485510987" lastFinishedPulling="2026-04-22 18:20:48.619108123 +0000 UTC m=+20.235749234" observedRunningTime="2026-04-22 18:20:55.064721238 +0000 UTC m=+26.681362389" watchObservedRunningTime="2026-04-22 18:20:55.066358986 +0000 UTC m=+26.683000113" Apr 22 18:20:55.651857 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.651828 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-sf25h"] Apr 22 18:20:55.652063 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.651915 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:55.652063 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:55.652006 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:55.655613 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.655588 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-44prk"] Apr 22 18:20:55.655738 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.655690 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:55.655851 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:55.655807 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:55.656345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.656319 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-crcs7"] Apr 22 18:20:55.656588 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:55.656421 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:55.656588 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:55.656530 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:56.019185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:56.019101 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:20:56.027070 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:56.027042 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" containerID="083fe475f9b1a9283cb9db10cafb9d60d44b2e37dad6137953f2be2c8fad5969" exitCode=0 Apr 22 18:20:56.027211 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:56.027114 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"083fe475f9b1a9283cb9db10cafb9d60d44b2e37dad6137953f2be2c8fad5969"} Apr 22 18:20:56.894730 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:56.894700 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:56.894930 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:56.894705 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:56.894930 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:56.894845 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:56.894930 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:56.894917 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:57.894469 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:57.894438 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:57.894842 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:57.894578 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:20:58.897868 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:58.896577 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:20:58.897868 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:58.896748 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:20:58.897868 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:58.897207 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:58.897868 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:58.897336 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:20:59.808036 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:59.807995 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:20:59.808210 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:59.808173 2561 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:59.808290 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:59.808252 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret podName:3c1d8261-0db3-4d2b-808a-e6bfde776154 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:15.808233097 +0000 UTC m=+47.424874225 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret") pod "global-pull-secret-syncer-crcs7" (UID: "3c1d8261-0db3-4d2b-808a-e6bfde776154") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:20:59.894411 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:20:59.894379 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:20:59.894590 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:20:59.894487 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-sf25h" podUID="8ad229f6-99cd-4eac-9f27-b8ae51b8bde3" Apr 22 18:21:00.894422 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:00.894390 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:21:00.894890 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:00.894392 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:21:00.894890 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:00.894500 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-44prk" podUID="c854aae6-d913-46c5-9cec-ae4b5f6e8ff7" Apr 22 18:21:00.894890 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:00.894608 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-crcs7" podUID="3c1d8261-0db3-4d2b-808a-e6bfde776154" Apr 22 18:21:01.239284 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.239205 2561 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-143-95.ec2.internal" event="NodeReady" Apr 22 18:21:01.239466 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.239452 2561 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:21:01.302132 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.302101 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fk8pv"] Apr 22 18:21:01.306274 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.306246 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:01.308479 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.308455 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"] Apr 22 18:21:01.310538 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.310490 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"console-operator-dockercfg-ckxnh\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ckxnh\"" type="*v1.Secret" Apr 22 18:21:01.310822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.310779 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"serving-cert\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" type="*v1.Secret" Apr 22 18:21:01.310822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.310795 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"console-operator-config\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" type="*v1.ConfigMap" Apr 22 18:21:01.311424 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:01.311271 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"] Apr 22 18:21:01.311424 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.311284 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" type="*v1.ConfigMap" Apr 22 18:21:01.311424 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.311299 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap" Apr 22 18:21:01.311424 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.311339 2561 status_manager.go:895] "Failed to get status for pod" podUID="d9adb985-4468-4c75-8d62-db92f367d26a" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" err="pods \"console-operator-9d4b6777b-fk8pv\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" Apr 22 18:21:01.311714 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.311433 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps 
\"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-console-operator\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Apr 22 18:21:01.311714 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.311543 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.314646 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2sbhl"] Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.314994 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"registry-dockercfg-4bvf4\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4bvf4\"" type="*v1.Secret" Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.315085 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-private-configuration\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" type="*v1.Secret" Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: I0422 
18:21:01.315138 2561 status_manager.go:895] "Failed to get status for pod" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" err="pods \"image-registry-65444dcf5-96pv2\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.315251 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"installation-pull-secrets\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" type="*v1.Secret" Apr 22 18:21:01.315824 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.315387 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:01.318314 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.318279 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"] Apr 22 18:21:01.318419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.318374 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:21:01.318419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.318402 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.319240 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.319216 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:21:01.319240 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.319233 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r7d2d\"" Apr 22 18:21:01.319411 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.319316 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"trusted-ca\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" type="*v1.ConfigMap" Apr 22 18:21:01.320199 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.319702 2561 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"image-registry-tls\" is forbidden: User \"system:node:ip-10-0-143-95.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'ip-10-0-143-95.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" type="*v1.Secret" Apr 22 18:21:01.322294 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.322273 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"] Apr 22 18:21:01.322385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.322321 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:21:01.322576 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.322562 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.325281 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.325004 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"] Apr 22 18:21:01.325281 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.325205 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs" Apr 22 18:21:01.327756 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.327737 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-65fc44b94d-k2qvj"] Apr 22 18:21:01.327849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.327812 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:01.329005 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.328980 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 18:21:01.329097 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.328982 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:21:01.329275 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.329253 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:21:01.330186 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.330165 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:21:01.330273 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.330166 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 18:21:01.331032 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.330536 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"] Apr 22 18:21:01.331032 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.330677 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.331250 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.331233 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 18:21:01.331774 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.331755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 18:21:01.333676 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.333658 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.344423 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.344397 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-wzkwn\"" Apr 22 18:21:01.344423 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.344398 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-f6g79\"" Apr 22 18:21:01.345522 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.345487 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:21:01.345729 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.345714 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 22 18:21:01.347162 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347143 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:21:01.347261 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:01.347174 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-k5msw\""
Apr 22 18:21:01.347385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347367 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.347487 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347408 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"]
Apr 22 18:21:01.347487 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347431 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 18:21:01.347630 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347591 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.347723 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347704 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sg8lg\""
Apr 22 18:21:01.347890 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.347792 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 18:21:01.348261 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.348212 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.348442 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.348425 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 18:21:01.348591 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.348473 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 18:21:01.348651 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.348623 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 18:21:01.349584 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.349562 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 18:21:01.350419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.349722 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 22 18:21:01.350419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.349783 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 22 18:21:01.350419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.349786 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.350419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.349849 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-czd76\""
Apr 22 18:21:01.350419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.350267 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-j8hgn\""
Apr 22 18:21:01.350789 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.350632 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:21:01.350789 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.350710 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"]
Apr 22 18:21:01.350894 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.350874 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"
Apr 22 18:21:01.350984 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.350968 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 18:21:01.352400 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.352367 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"]
Apr 22 18:21:01.352827 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.352807 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"]
Apr 22 18:21:01.355104 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.355086 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 18:21:01.356017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.355998 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"
Apr 22 18:21:01.359053 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.359004 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fk8pv"]
Apr 22 18:21:01.371642 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.371497 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-9mqk2\""
Apr 22 18:21:01.371937 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.371919 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 18:21:01.372713 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.372690 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2sbhl"]
Apr 22 18:21:01.373220 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.373202 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 18:21:01.373651 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.373632 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-66ht9"]
Apr 22 18:21:01.373736 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.373678 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:21:01.373952 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.373938 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-7hcdf\""
Apr 22 18:21:01.377543 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.377473 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66ht9"
Apr 22 18:21:01.388031 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.388007 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.395662 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.395324 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:21:01.395662 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.395643 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:21:01.395815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.395796 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:21:01.395972 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.395955 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwl6t\""
Apr 22 18:21:01.406860 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.406839 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"]
Apr 22 18:21:01.413622 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.413507 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65fc44b94d-k2qvj"]
Apr 22 18:21:01.415568 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.415388 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"]
Apr 22 18:21:01.415568 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.415420 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"]
Apr 22 18:21:01.417456 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.417405 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66ht9"]
Apr 22 18:21:01.418011 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.417991 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"]
Apr 22 18:21:01.418797 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.418781 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"]
Apr 22 18:21:01.419622 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419588 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"]
Apr 22 18:21:01.419725 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419654 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8s9tw\" (UniqueName: \"kubernetes.io/projected/4280d58f-b305-45fb-a79c-389e20a9cd66-kube-api-access-8s9tw\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.419725 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419689 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-service-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.419725 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419716 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2acdde8-b24c-4983-b9f7-961b896d0102-config\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"
Apr 22 18:21:01.419876 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419743 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c11aaf-3f61-4ced-8377-07f284493875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"
Apr 22 18:21:01.419876 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419769 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.419876 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419796 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzv5q\" (UniqueName: \"kubernetes.io/projected/1876a404-9b16-4840-ba70-f6e585f28d86-kube-api-access-zzv5q\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.419876 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419840 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419874 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419908 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2acdde8-b24c-4983-b9f7-961b896d0102-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.419969 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx2gc\" (UniqueName: \"kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420007 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9adb985-4468-4c75-8d62-db92f367d26a-serving-cert\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420033 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-trusted-ca\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:01.420078 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420057 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq26s\" (UniqueName: \"kubernetes.io/projected/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-kube-api-access-mq26s\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420138 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420173 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420225 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-snapshots\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420283 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-stats-auth\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420311 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66z6j\" (UniqueName: \"kubernetes.io/projected/f6c11aaf-3f61-4ced-8377-07f284493875-kube-api-access-66z6j\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420339 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsrww\" (UniqueName: \"kubernetes.io/projected/a2acdde8-b24c-4983-b9f7-961b896d0102-kube-api-access-bsrww\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"
Apr 22 18:21:01.420399 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420389 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7h76j\" (UniqueName: \"kubernetes.io/projected/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-kube-api-access-7h76j\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420418 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-config\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420443 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420479 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420533 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c11aaf-3f61-4ced-8377-07f284493875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420573 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420599 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420623 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420647 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420673 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-tmp\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420697 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"
Apr 22 18:21:01.420802 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420782 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1876a404-9b16-4840-ba70-f6e585f28d86-serving-cert\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.421233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420820 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzl2\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.421233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420848 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv82m\" (UniqueName: \"kubernetes.io/projected/3ade402b-25f9-4705-9e38-c812058fd982-kube-api-access-cv82m\") pod \"volume-data-source-validator-7c6cbb6c87-q6bgs\" (UID: \"3ade402b-25f9-4705-9e38-c812058fd982\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"
Apr 22 18:21:01.421233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420891 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.421233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.420922 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-default-certificate\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.427742 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.427705 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-294lm"]
Apr 22 18:21:01.431234 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.431211 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.434849 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.434830 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:21:01.434945 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.434858 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:21:01.435241 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.435223 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-69x95\""
Apr 22 18:21:01.446980 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.446945 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-294lm"]
Apr 22 18:21:01.521798 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521719 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.521798 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521753 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.521798 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521781 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-config-volume\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521808 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4v7g\" (UniqueName: \"kubernetes.io/projected/319485bf-3dcb-4995-b853-56ed38442a76-kube-api-access-z4v7g\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521835 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521858 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521884 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-tmp\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.521961 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522003 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1876a404-9b16-4840-ba70-f6e585f28d86-serving-cert\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522033 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzl2\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522060 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv82m\" (UniqueName: \"kubernetes.io/projected/3ade402b-25f9-4705-9e38-c812058fd982-kube-api-access-cv82m\") pod \"volume-data-source-validator-7c6cbb6c87-q6bgs\" (UID: \"3ade402b-25f9-4705-9e38-c812058fd982\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"
Apr 22 18:21:01.522071 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.522067 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522086 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.522128 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.022107363 +0000 UTC m=+33.638748486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522152 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-default-certificate\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522190 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8s9tw\" (UniqueName: \"kubernetes.io/projected/4280d58f-b305-45fb-a79c-389e20a9cd66-kube-api-access-8s9tw\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522200 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522224 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-service-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522253 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2acdde8-b24c-4983-b9f7-961b896d0102-config\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522266 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-tmp\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522284 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gn9h\" (UniqueName: \"kubernetes.io/projected/ede0fe22-46d7-48a6-93c9-92f84082afd4-kube-api-access-6gn9h\") pod \"network-check-source-8894fc9bd-g4tf9\" (UID: \"ede0fe22-46d7-48a6-93c9-92f84082afd4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522317 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c11aaf-3f61-4ced-8377-07f284493875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522345 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522374 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzv5q\" (UniqueName: \"kubernetes.io/projected/1876a404-9b16-4840-ba70-f6e585f28d86-kube-api-access-zzv5q\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522398 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522420 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:01.522539 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.522491 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.522556 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.022541973 +0000 UTC m=+33.639183078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.522673 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.022661074 +0000 UTC m=+33.639302186 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522775 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-service-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522886 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522922 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2acdde8-b24c-4983-b9f7-961b896d0102-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522967 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2gc\" (UniqueName: \"kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522970 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1876a404-9b16-4840-ba70-f6e585f28d86-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.522995 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523025 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9adb985-4468-4c75-8d62-db92f367d26a-serving-cert\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523052 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-trusted-ca\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523079 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mq26s\" (UniqueName: 
\"kubernetes.io/projected/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-kube-api-access-mq26s\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523107 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523133 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523171 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-snapshots\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.523221 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-stats-auth\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.523847 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:01.523197 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2acdde8-b24c-4983-b9f7-961b896d0102-config\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523235 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-66z6j\" (UniqueName: \"kubernetes.io/projected/f6c11aaf-3f61-4ced-8377-07f284493875-kube-api-access-66z6j\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523268 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsrww\" (UniqueName: \"kubernetes.io/projected/a2acdde8-b24c-4983-b9f7-961b896d0102-kube-api-access-bsrww\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523726 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7h76j\" (UniqueName: \"kubernetes.io/projected/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-kube-api-access-7h76j\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523764 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-config\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523793 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.523847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523827 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523863 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523903 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: 
\"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523942 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c11aaf-3f61-4ced-8377-07f284493875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.523971 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-tmp-dir\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.524002 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wtzd\" (UniqueName: \"kubernetes.io/projected/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-kube-api-access-4wtzd\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.524069 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/1876a404-9b16-4840-ba70-f6e585f28d86-snapshots\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.524089 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.524150 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.524122 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:21:01.524529 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.524187 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.024172315 +0000 UTC m=+33.640813432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found Apr 22 18:21:01.524980 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.524953 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:01.525123 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.525081 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6c11aaf-3f61-4ced-8377-07f284493875-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.526957 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.526932 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2acdde8-b24c-4983-b9f7-961b896d0102-serving-cert\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.527030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.526956 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6c11aaf-3f61-4ced-8377-07f284493875-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.527030 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.526939 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1876a404-9b16-4840-ba70-f6e585f28d86-serving-cert\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.527108 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.527067 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-default-certificate\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.527108 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.527080 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-stats-auth\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.542552 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.542503 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv82m\" (UniqueName: \"kubernetes.io/projected/3ade402b-25f9-4705-9e38-c812058fd982-kube-api-access-cv82m\") pod \"volume-data-source-validator-7c6cbb6c87-q6bgs\" (UID: \"3ade402b-25f9-4705-9e38-c812058fd982\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs" Apr 22 18:21:01.542905 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.542886 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzv5q\" (UniqueName: \"kubernetes.io/projected/1876a404-9b16-4840-ba70-f6e585f28d86-kube-api-access-zzv5q\") pod \"insights-operator-585dfdc468-2sbhl\" (UID: \"1876a404-9b16-4840-ba70-f6e585f28d86\") " pod="openshift-insights/insights-operator-585dfdc468-2sbhl" Apr 22 18:21:01.545814 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.545790 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-66z6j\" (UniqueName: \"kubernetes.io/projected/f6c11aaf-3f61-4ced-8377-07f284493875-kube-api-access-66z6j\") pod \"kube-storage-version-migrator-operator-6769c5d45-sbpmg\" (UID: \"f6c11aaf-3f61-4ced-8377-07f284493875\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" Apr 22 18:21:01.547457 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.547434 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7h76j\" (UniqueName: \"kubernetes.io/projected/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-kube-api-access-7h76j\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" 
(UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:01.548861 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.548842 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.558668 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.558643 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsrww\" (UniqueName: \"kubernetes.io/projected/a2acdde8-b24c-4983-b9f7-961b896d0102-kube-api-access-bsrww\") pod \"service-ca-operator-d6fc45fc5-zrhmm\" (UID: \"a2acdde8-b24c-4983-b9f7-961b896d0102\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" Apr 22 18:21:01.559052 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.559026 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8s9tw\" (UniqueName: \"kubernetes.io/projected/4280d58f-b305-45fb-a79c-389e20a9cd66-kube-api-access-8s9tw\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:01.559646 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.559628 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzl2\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:01.560050 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.560020 2561 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-mq26s\" (UniqueName: \"kubernetes.io/projected/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-kube-api-access-mq26s\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:01.624912 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.624881 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gn9h\" (UniqueName: \"kubernetes.io/projected/ede0fe22-46d7-48a6-93c9-92f84082afd4-kube-api-access-6gn9h\") pod \"network-check-source-8894fc9bd-g4tf9\" (UID: \"ede0fe22-46d7-48a6-93c9-92f84082afd4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9" Apr 22 18:21:01.625091 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.624938 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:01.625091 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.624966 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:01.625654 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625617 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:01.625654 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.625627 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:01.625839 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625700 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.125669527 +0000 UTC m=+33.742310653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found Apr 22 18:21:01.625839 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625713 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:01.625839 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.625726 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:01.625839 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625742 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:21:01.625839 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625792 2561 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.125778905 +0000 UTC m=+33.742420011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found Apr 22 18:21:01.626072 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:01.625866 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:02.125846826 +0000 UTC m=+33.742487937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found Apr 22 18:21:01.626072 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.625999 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-tmp-dir\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:01.626072 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.626026 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wtzd\" (UniqueName: \"kubernetes.io/projected/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-kube-api-access-4wtzd\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " 
pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.626072 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.626057 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-config-volume\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.626254 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.626080 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4v7g\" (UniqueName: \"kubernetes.io/projected/319485bf-3dcb-4995-b853-56ed38442a76-kube-api-access-z4v7g\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9"
Apr 22 18:21:01.627615 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.626483 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"
Apr 22 18:21:01.627615 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.626734 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-tmp-dir\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.629585 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.629560 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-config-volume\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.637843 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.637813 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gn9h\" (UniqueName: \"kubernetes.io/projected/ede0fe22-46d7-48a6-93c9-92f84082afd4-kube-api-access-6gn9h\") pod \"network-check-source-8894fc9bd-g4tf9\" (UID: \"ede0fe22-46d7-48a6-93c9-92f84082afd4\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"
Apr 22 18:21:01.640883 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.640862 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wtzd\" (UniqueName: \"kubernetes.io/projected/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-kube-api-access-4wtzd\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:01.640883 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.640875 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4v7g\" (UniqueName: \"kubernetes.io/projected/319485bf-3dcb-4995-b853-56ed38442a76-kube-api-access-z4v7g\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9"
Apr 22 18:21:01.643024 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.642988 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
Apr 22 18:21:01.654015 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.653995 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"
Apr 22 18:21:01.660705 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.660670 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"
Apr 22 18:21:01.684534 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.684478 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"
Apr 22 18:21:01.692003 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.691968 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"
Apr 22 18:21:01.858491 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.858208 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg"]
Apr 22 18:21:01.861970 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:01.861939 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c11aaf_3f61_4ced_8377_07f284493875.slice/crio-1adb04efc32e94efd2d80171fe76c98fde1d07bd4281a8f0f8b86cead612cbf2 WatchSource:0}: Error finding container 1adb04efc32e94efd2d80171fe76c98fde1d07bd4281a8f0f8b86cead612cbf2: Status 404 returned error can't find the container with id 1adb04efc32e94efd2d80171fe76c98fde1d07bd4281a8f0f8b86cead612cbf2
Apr 22 18:21:01.866167 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.866147 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs"]
Apr 22 18:21:01.869277 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.869105 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-2sbhl"]
Apr 22 18:21:01.871448 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:01.871423 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ade402b_25f9_4705_9e38_c812058fd982.slice/crio-afcf91612ce18511e86b6d090c86787e082e3699b96208598dd2d8361e683621 WatchSource:0}: Error finding container afcf91612ce18511e86b6d090c86787e082e3699b96208598dd2d8361e683621: Status 404 returned error can't find the container with id afcf91612ce18511e86b6d090c86787e082e3699b96208598dd2d8361e683621
Apr 22 18:21:01.872619 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:01.872594 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1876a404_9b16_4840_ba70_f6e585f28d86.slice/crio-cfa8bc7358c23779b6744ef00adb88b3378e24c2be8692be9f8d4dc8c37e0e3e WatchSource:0}: Error finding container cfa8bc7358c23779b6744ef00adb88b3378e24c2be8692be9f8d4dc8c37e0e3e: Status 404 returned error can't find the container with id cfa8bc7358c23779b6744ef00adb88b3378e24c2be8692be9f8d4dc8c37e0e3e
Apr 22 18:21:01.883781 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.883745 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9"]
Apr 22 18:21:01.886303 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:01.886269 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podede0fe22_46d7_48a6_93c9_92f84082afd4.slice/crio-1fd9bcf80452fc265ec41d4a84d88da8dd7bae7d3d9b8fd8cf2561516e4045fa WatchSource:0}: Error finding container 1fd9bcf80452fc265ec41d4a84d88da8dd7bae7d3d9b8fd8cf2561516e4045fa: Status 404 returned error can't find the container with id 1fd9bcf80452fc265ec41d4a84d88da8dd7bae7d3d9b8fd8cf2561516e4045fa
Apr 22 18:21:01.892741 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.892717 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm"]
Apr 22 18:21:01.894180 ip-10-0-143-95
kubenswrapper[2561]: I0422 18:21:01.894160 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h"
Apr 22 18:21:01.896612 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:01.896588 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2acdde8_b24c_4983_b9f7_961b896d0102.slice/crio-33ae9e435c92ca5c8025b8f7c698b5c5808a078b34fd8553fbfd9eff0acc50ba WatchSource:0}: Error finding container 33ae9e435c92ca5c8025b8f7c698b5c5808a078b34fd8553fbfd9eff0acc50ba: Status 404 returned error can't find the container with id 33ae9e435c92ca5c8025b8f7c698b5c5808a078b34fd8553fbfd9eff0acc50ba
Apr 22 18:21:01.897506 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:01.897486 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-zkrzn\""
Apr 22 18:21:02.033462 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.033368 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"
Apr 22 18:21:02.033462 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.033437 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:02.033462 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.033460 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.033537 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033567 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033579 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.033558648 +0000 UTC m=+34.650199754 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033617 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033629 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.033618598 +0000 UTC m=+34.650259703 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033624 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033675 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.033658057 +0000 UTC m=+34.650299169 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:02.033748 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.033709 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.033699688 +0000 UTC m=+34.650340821 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found
Apr 22 18:21:02.038919 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.038888 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs" event={"ID":"3ade402b-25f9-4705-9e38-c812058fd982","Type":"ContainerStarted","Data":"afcf91612ce18511e86b6d090c86787e082e3699b96208598dd2d8361e683621"}
Apr 22 18:21:02.040053 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.040027 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9" event={"ID":"ede0fe22-46d7-48a6-93c9-92f84082afd4","Type":"ContainerStarted","Data":"1fd9bcf80452fc265ec41d4a84d88da8dd7bae7d3d9b8fd8cf2561516e4045fa"}
Apr 22 18:21:02.041114 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.041091 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2sbhl"
event={"ID":"1876a404-9b16-4840-ba70-f6e585f28d86","Type":"ContainerStarted","Data":"cfa8bc7358c23779b6744ef00adb88b3378e24c2be8692be9f8d4dc8c37e0e3e"}
Apr 22 18:21:02.042643 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.042596 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" event={"ID":"a2acdde8-b24c-4983-b9f7-961b896d0102","Type":"ContainerStarted","Data":"33ae9e435c92ca5c8025b8f7c698b5c5808a078b34fd8553fbfd9eff0acc50ba"}
Apr 22 18:21:02.043900 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.043879 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" event={"ID":"f6c11aaf-3f61-4ced-8377-07f284493875","Type":"ContainerStarted","Data":"1adb04efc32e94efd2d80171fe76c98fde1d07bd4281a8f0f8b86cead612cbf2"}
Apr 22 18:21:02.134675 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.134630 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9"
Apr 22 18:21:02.134835 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.134688 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm"
Apr 22 18:21:02.134835 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.134768 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"
Apr 22 18:21:02.134835 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134777 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:21:02.134835 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134828 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:21:02.134968 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134851 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:21:02.134968 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134835 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.134816676 +0000 UTC m=+34.751457803 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found
Apr 22 18:21:02.134968 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134888 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.134873424 +0000 UTC m=+34.751514530 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found
Apr 22 18:21:02.134968 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.134903 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.134897632 +0000 UTC m=+34.751538738 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found
Apr 22 18:21:02.235427 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.235396 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 22 18:21:02.237253 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.237232 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 22 18:21:02.244919 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.244891 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-trusted-ca\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:02.266813 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.266789 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 22 18:21:02.278186 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.278154 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d9adb985-4468-4c75-8d62-db92f367d26a-serving-cert\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:02.310830 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.310755 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:21:02.313127 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.313103 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:02.387898 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.387865 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 22 18:21:02.395295 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.395266 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d9adb985-4468-4c75-8d62-db92f367d26a-config\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv"
Apr 22 18:21:02.522467 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522431 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522467
ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522465 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65444dcf5-96pv2: failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522709 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522434 2561 secret.go:189] Couldn't get secret openshift-image-registry/installation-pull-secrets: failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522709 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522548 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.022528975 +0000 UTC m=+34.639170097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522709 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522601 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.022579649 +0000 UTC m=+34.639220772 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "installation-pull-secrets" (UniqueName: "kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522709 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522645 2561 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-private-configuration: failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.522709 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.522711 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.022694723 +0000 UTC m=+34.639335843 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-private-configuration" (UniqueName: "kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : failed to sync secret cache: timed out waiting for the condition
Apr 22 18:21:02.539832 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.539625 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-ckxnh\""
Apr 22 18:21:02.539832 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.539678 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h"
Apr 22 18:21:02.542894 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.542670 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:21:02.543076 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.543028 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb99p\" (UniqueName: \"kubernetes.io/projected/8ad229f6-99cd-4eac-9f27-b8ae51b8bde3-kube-api-access-pb99p\") pod \"network-check-target-sf25h\" (UID: \"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3\") " pod="openshift-network-diagnostics/network-check-target-sf25h"
Apr 22 18:21:02.556287 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.556206 2561 projected.go:289] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Apr 22 18:21:02.556287 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.556246 2561 projected.go:194] Error preparing data for projected volume kube-api-access-hx2gc for pod openshift-console-operator/console-operator-9d4b6777b-fk8pv: failed to sync configmap cache: timed out waiting for the condition
Apr 22 18:21:02.556500 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.556310 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc podName:d9adb985-4468-4c75-8d62-db92f367d26a nodeName:}" failed. No retries permitted until 2026-04-22 18:21:03.056287712 +0000 UTC m=+34.672928832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hx2gc" (UniqueName: "kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc") pod "console-operator-9d4b6777b-fk8pv" (UID: "d9adb985-4468-4c75-8d62-db92f367d26a") : failed to sync configmap cache: timed out waiting for the condition
Apr 22 18:21:02.569856 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.569769 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4bvf4\""
Apr 22 18:21:02.641412 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.641373 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk"
Apr 22 18:21:02.641632 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.641615 2561 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:02.641720 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:02.641680 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs podName:c854aae6-d913-46c5-9cec-ae4b5f6e8ff7 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:34.641661572 +0000 UTC m=+66.258302694 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs") pod "network-metrics-daemon-44prk" (UID: "c854aae6-d913-46c5-9cec-ae4b5f6e8ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:21:02.665268 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.664940 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 22 18:21:02.741051 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.740938 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:21:02.805828 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.805756 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-sf25h"
Apr 22 18:21:02.851819 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.851744 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:21:02.897958 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.895254 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7"
Apr 22 18:21:02.897958 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.896804 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk"
Apr 22 18:21:02.901238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.900738 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:21:02.901238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.901005 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bn55n\""
Apr 22 18:21:02.901238 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:02.901073 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045310 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045365 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045418 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045468 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045499 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045544 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045566 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045634 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.045612401 +0000 UTC m=+36.662253510 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.045662 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj"
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045698 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045748 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.045732239 +0000 UTC m=+36.662373348 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045784 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.045819 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.04580727 +0000 UTC m=+36.662448379 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.046222 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.046237 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65444dcf5-96pv2: secret "image-registry-tls" not found
Apr 22 18:21:03.047304 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.046275 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:04.046260351 +0000 UTC m=+35.662901463 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : secret "image-registry-tls" not found
Apr 22 18:21:03.048226 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.046705 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.046688398 +0000 UTC m=+36.663329516 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt
Apr 22 18:21:03.065223 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.065160 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22 18:21:03.065223 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.065186 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2"
Apr 22
18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.146451 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hx2gc\" (UniqueName: \"kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.146498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.146584 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.146709 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146743 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146782 2561 secret.go:189] Couldn't get 
secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146805 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.146790284 +0000 UTC m=+36.763431390 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146805 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146841 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:05.146824594 +0000 UTC m=+36.763465719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found Apr 22 18:21:03.146961 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:03.146859 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:05.146849678 +0000 UTC m=+36.763490787 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found Apr 22 18:21:03.151677 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.151624 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx2gc\" (UniqueName: \"kubernetes.io/projected/d9adb985-4468-4c75-8d62-db92f367d26a-kube-api-access-hx2gc\") pod \"console-operator-9d4b6777b-fk8pv\" (UID: \"d9adb985-4468-4c75-8d62-db92f367d26a\") " pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:03.421138 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:03.421039 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:04.054855 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:04.054822 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:04.055303 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:04.054994 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:04.055303 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:04.055018 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65444dcf5-96pv2: secret "image-registry-tls" not found Apr 22 18:21:04.055303 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:04.055080 2561 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:06.05506041 +0000 UTC m=+37.671701527 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : secret "image-registry-tls" not found Apr 22 18:21:05.063275 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.063240 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.063314 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.063344 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:05.063822 ip-10-0-143-95 
kubenswrapper[2561]: E0422 18:21:05.063347 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.063391 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063411 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.063392326 +0000 UTC m=+40.680033436 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063452 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.063440956 +0000 UTC m=+40.680082068 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063483 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063483 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063555 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.063538886 +0000 UTC m=+40.680180009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:05.063822 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.063601 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.063582148 +0000 UTC m=+40.680223259 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found Apr 22 18:21:05.164898 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.164861 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:05.165072 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.164912 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:05.165072 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:05.164966 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:05.165072 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.165024 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:05.165230 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.165080 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:05.165230 ip-10-0-143-95 kubenswrapper[2561]: E0422 
18:21:05.165099 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.165078478 +0000 UTC m=+40.781719588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found Apr 22 18:21:05.165230 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.165125 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:21:05.165230 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.165140 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.16512375 +0000 UTC m=+40.781764863 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found Apr 22 18:21:05.165230 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:05.165168 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:09.165156907 +0000 UTC m=+40.781798026 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found Apr 22 18:21:06.075097 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:06.075058 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:06.075560 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:06.075220 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:06.075560 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:06.075238 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65444dcf5-96pv2: secret "image-registry-tls" not found Apr 22 18:21:06.075560 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:06.075293 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:10.075277229 +0000 UTC m=+41.691918335 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : secret "image-registry-tls" not found Apr 22 18:21:09.104288 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.104254 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.104342 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.104382 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.104402 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104402 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104482 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104504 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.104486931 +0000 UTC m=+48.721128059 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104538 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.104528368 +0000 UTC m=+48.721169476 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104550 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.104544455 +0000 UTC m=+48.721185561 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104569 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:09.104749 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.104626 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.104610956 +0000 UTC m=+48.721252074 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:09.205291 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.205250 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:09.205447 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205394 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:21:09.205492 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205469 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.205451755 +0000 UTC m=+48.822092883 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found Apr 22 18:21:09.205574 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.205489 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:09.205574 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.205538 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:09.205679 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205645 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:09.205726 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205684 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.205675767 +0000 UTC m=+48.822316876 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found Apr 22 18:21:09.205726 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205645 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:09.205726 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:09.205712 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:17.205706845 +0000 UTC m=+48.822347951 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found Apr 22 18:21:09.791069 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.791026 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-fk8pv"] Apr 22 18:21:09.804277 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:09.804232 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9adb985_4468_4c75_8d62_db92f367d26a.slice/crio-70222f0ad4afdefbc0a5b036f403be23bf0affa435cdeef1d714dc567b3b788b WatchSource:0}: Error finding container 70222f0ad4afdefbc0a5b036f403be23bf0affa435cdeef1d714dc567b3b788b: Status 404 returned error can't find the container with id 70222f0ad4afdefbc0a5b036f403be23bf0affa435cdeef1d714dc567b3b788b Apr 22 18:21:09.813231 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:09.813190 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-network-diagnostics/network-check-target-sf25h"] Apr 22 18:21:09.817181 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:09.817146 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ad229f6_99cd_4eac_9f27_b8ae51b8bde3.slice/crio-3990e791fe02fb0e973c26539a8fd9611477a3192130e47e6051dc51edfdc7dd WatchSource:0}: Error finding container 3990e791fe02fb0e973c26539a8fd9611477a3192130e47e6051dc51edfdc7dd: Status 404 returned error can't find the container with id 3990e791fe02fb0e973c26539a8fd9611477a3192130e47e6051dc51edfdc7dd Apr 22 18:21:10.064822 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.064321 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs" event={"ID":"3ade402b-25f9-4705-9e38-c812058fd982","Type":"ContainerStarted","Data":"2542c8ab68c5a653d12295363ac798d1bae89beaf78c8ffe533585c90ee56733"} Apr 22 18:21:10.066532 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.066464 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9" event={"ID":"ede0fe22-46d7-48a6-93c9-92f84082afd4","Type":"ContainerStarted","Data":"665b114c6ba8846d2117db918a61d0b8906423ad07c51192759137632ab15fc2"} Apr 22 18:21:10.068338 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.067980 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2sbhl" event={"ID":"1876a404-9b16-4840-ba70-f6e585f28d86","Type":"ContainerStarted","Data":"3a81c2fbbf0de14266c0228bb29ff2e0af67c6089cd6358cab1f9a3aa799e3e0"} Apr 22 18:21:10.071092 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.071065 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" 
event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerStarted","Data":"c968baee130025ce41253169a8bd6e6743ba94b0bb6da731a2d654b811be18c9"} Apr 22 18:21:10.072543 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.072493 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" event={"ID":"a2acdde8-b24c-4983-b9f7-961b896d0102","Type":"ContainerStarted","Data":"3dc9a3d21cdf832c7da2e5d67588bec9ad1658cf5617b8ab75b114fc162d4bcb"} Apr 22 18:21:10.074094 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.074075 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" event={"ID":"f6c11aaf-3f61-4ced-8377-07f284493875","Type":"ContainerStarted","Data":"dcfafe973e3c27acc3cb8ed0bf7bf7871e08d7052c28f9e230a437b61cc94cad"} Apr 22 18:21:10.075197 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.075173 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" event={"ID":"d9adb985-4468-4c75-8d62-db92f367d26a","Type":"ContainerStarted","Data":"70222f0ad4afdefbc0a5b036f403be23bf0affa435cdeef1d714dc567b3b788b"} Apr 22 18:21:10.076772 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.076709 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sf25h" event={"ID":"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3","Type":"ContainerStarted","Data":"def9c6231e79785b3b3528aa60ba021be605635da3f73b515082f3756a22572d"} Apr 22 18:21:10.076772 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.076737 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-sf25h" event={"ID":"8ad229f6-99cd-4eac-9f27-b8ae51b8bde3","Type":"ContainerStarted","Data":"3990e791fe02fb0e973c26539a8fd9611477a3192130e47e6051dc51edfdc7dd"} Apr 22 18:21:10.076999 
ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.076972 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:21:10.085137 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.085090 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-q6bgs" podStartSLOduration=22.303452046 podStartE2EDuration="30.085078966s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:01.873656094 +0000 UTC m=+33.490297204" lastFinishedPulling="2026-04-22 18:21:09.655283017 +0000 UTC m=+41.271924124" observedRunningTime="2026-04-22 18:21:10.084171387 +0000 UTC m=+41.700812515" watchObservedRunningTime="2026-04-22 18:21:10.085078966 +0000 UTC m=+41.701720093" Apr 22 18:21:10.113697 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.113672 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:10.116310 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:10.115555 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:10.116310 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:10.115575 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-65444dcf5-96pv2: secret "image-registry-tls" not found Apr 22 18:21:10.116310 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:10.115624 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls 
podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:18.115607293 +0000 UTC m=+49.732248403 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : secret "image-registry-tls" not found Apr 22 18:21:10.124330 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.124281 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-sf25h" podStartSLOduration=42.124265591 podStartE2EDuration="42.124265591s" podCreationTimestamp="2026-04-22 18:20:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:10.122532293 +0000 UTC m=+41.739173434" watchObservedRunningTime="2026-04-22 18:21:10.124265591 +0000 UTC m=+41.740906720" Apr 22 18:21:10.192654 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.192557 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-g4tf9" podStartSLOduration=22.424588532 podStartE2EDuration="30.19253817s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:01.88834102 +0000 UTC m=+33.504982130" lastFinishedPulling="2026-04-22 18:21:09.656290652 +0000 UTC m=+41.272931768" observedRunningTime="2026-04-22 18:21:10.191564964 +0000 UTC m=+41.808206093" watchObservedRunningTime="2026-04-22 18:21:10.19253817 +0000 UTC m=+41.809179302" Apr 22 18:21:10.228959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.228902 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" 
podStartSLOduration=22.435347787 podStartE2EDuration="30.228882146s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:01.864792756 +0000 UTC m=+33.481433875" lastFinishedPulling="2026-04-22 18:21:09.658327115 +0000 UTC m=+41.274968234" observedRunningTime="2026-04-22 18:21:10.228544693 +0000 UTC m=+41.845185822" watchObservedRunningTime="2026-04-22 18:21:10.228882146 +0000 UTC m=+41.845523275" Apr 22 18:21:10.283727 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.283670 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-2sbhl" podStartSLOduration=22.502259003 podStartE2EDuration="30.28364975s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:01.874630273 +0000 UTC m=+33.491271379" lastFinishedPulling="2026-04-22 18:21:09.656021006 +0000 UTC m=+41.272662126" observedRunningTime="2026-04-22 18:21:10.282335153 +0000 UTC m=+41.898976282" watchObservedRunningTime="2026-04-22 18:21:10.28364975 +0000 UTC m=+41.900290879" Apr 22 18:21:10.283886 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:10.283830 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" podStartSLOduration=22.520325211 podStartE2EDuration="30.283824468s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:01.898675539 +0000 UTC m=+33.515316645" lastFinishedPulling="2026-04-22 18:21:09.662174792 +0000 UTC m=+41.278815902" observedRunningTime="2026-04-22 18:21:10.252676763 +0000 UTC m=+41.869317894" watchObservedRunningTime="2026-04-22 18:21:10.283824468 +0000 UTC m=+41.900465592" Apr 22 18:21:11.081168 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.081133 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" 
containerID="c968baee130025ce41253169a8bd6e6743ba94b0bb6da731a2d654b811be18c9" exitCode=0 Apr 22 18:21:11.081342 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.081232 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"c968baee130025ce41253169a8bd6e6743ba94b0bb6da731a2d654b811be18c9"} Apr 22 18:21:11.581591 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.581502 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh"] Apr 22 18:21:11.605275 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.605245 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh"] Apr 22 18:21:11.605440 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.605367 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" Apr 22 18:21:11.609111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.609084 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:21:11.609754 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.609732 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:21:11.611551 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.611529 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rjmn8\"" Apr 22 18:21:11.627544 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.627498 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z762r\" (UniqueName: \"kubernetes.io/projected/c90628b5-0064-4541-8207-f01d20c9be00-kube-api-access-z762r\") pod \"migrator-74bb7799d9-gszhh\" (UID: \"c90628b5-0064-4541-8207-f01d20c9be00\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" Apr 22 18:21:11.728815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.728785 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z762r\" (UniqueName: \"kubernetes.io/projected/c90628b5-0064-4541-8207-f01d20c9be00-kube-api-access-z762r\") pod \"migrator-74bb7799d9-gszhh\" (UID: \"c90628b5-0064-4541-8207-f01d20c9be00\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" Apr 22 18:21:11.767731 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.767701 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z762r\" (UniqueName: \"kubernetes.io/projected/c90628b5-0064-4541-8207-f01d20c9be00-kube-api-access-z762r\") pod \"migrator-74bb7799d9-gszhh\" (UID: \"c90628b5-0064-4541-8207-f01d20c9be00\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" Apr 22 18:21:11.915531 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:11.915477 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" Apr 22 18:21:12.067940 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:12.067918 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh"] Apr 22 18:21:12.070098 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:12.070070 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc90628b5_0064_4541_8207_f01d20c9be00.slice/crio-a3db376529a9fad33efc7978dd6a658c1ae13a14c08414c6ea67c87c53d118c8 WatchSource:0}: Error finding container a3db376529a9fad33efc7978dd6a658c1ae13a14c08414c6ea67c87c53d118c8: Status 404 returned error can't find the container with id a3db376529a9fad33efc7978dd6a658c1ae13a14c08414c6ea67c87c53d118c8 Apr 22 18:21:12.086994 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:12.086965 2561 generic.go:358] "Generic (PLEG): container finished" podID="f617d906-31ed-45b2-ad64-99d0315fed58" containerID="cf248135183083fe1b553944c8b0c34a48b72b67a6b5f452e0cf8546aac26bee" exitCode=0 Apr 22 18:21:12.087101 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:12.087042 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerDied","Data":"cf248135183083fe1b553944c8b0c34a48b72b67a6b5f452e0cf8546aac26bee"} Apr 22 18:21:12.088770 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:12.088748 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" event={"ID":"c90628b5-0064-4541-8207-f01d20c9be00","Type":"ContainerStarted","Data":"a3db376529a9fad33efc7978dd6a658c1ae13a14c08414c6ea67c87c53d118c8"} Apr 22 18:21:13.094821 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.094791 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-cbzsc" event={"ID":"f617d906-31ed-45b2-ad64-99d0315fed58","Type":"ContainerStarted","Data":"b277c027d19c55a8ee7c2c4d80bc0e13cb3f5a9e4b20e77adcd46052bf9d4453"} Apr 22 18:21:13.125068 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.125019 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cbzsc" podStartSLOduration=6.3342871800000005 podStartE2EDuration="44.125005099s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:20:31.863367882 +0000 UTC m=+3.480008988" lastFinishedPulling="2026-04-22 18:21:09.654085797 +0000 UTC m=+41.270726907" observedRunningTime="2026-04-22 18:21:13.122876895 +0000 UTC m=+44.739518024" watchObservedRunningTime="2026-04-22 18:21:13.125005099 +0000 UTC m=+44.741646264" Apr 22 18:21:13.251747 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.251715 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5n6ps"] Apr 22 18:21:13.275428 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.275368 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5n6ps"] Apr 22 18:21:13.275643 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.275569 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.278869 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.278829 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 22 18:21:13.279003 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.278873 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 22 18:21:13.279003 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.278888 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qqhvl\"" Apr 22 18:21:13.279003 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.278830 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 22 18:21:13.279153 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.279074 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 22 18:21:13.347034 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.346951 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-cabundle\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.347034 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.347007 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxgqb\" (UniqueName: \"kubernetes.io/projected/ceb1ab8c-1a88-49de-a56a-634a6fd85614-kube-api-access-kxgqb\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " 
pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.347233 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.347089 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-key\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.447882 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.447838 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-cabundle\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.448124 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.447905 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxgqb\" (UniqueName: \"kubernetes.io/projected/ceb1ab8c-1a88-49de-a56a-634a6fd85614-kube-api-access-kxgqb\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.448124 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.447937 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-key\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.448620 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.448593 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-cabundle\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.451156 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.451132 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/ceb1ab8c-1a88-49de-a56a-634a6fd85614-signing-key\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.459020 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.458990 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxgqb\" (UniqueName: \"kubernetes.io/projected/ceb1ab8c-1a88-49de-a56a-634a6fd85614-kube-api-access-kxgqb\") pod \"service-ca-865cb79987-5n6ps\" (UID: \"ceb1ab8c-1a88-49de-a56a-634a6fd85614\") " pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.587482 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.587449 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-5n6ps" Apr 22 18:21:13.719968 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:13.719944 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-5n6ps"] Apr 22 18:21:13.722335 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:13.722305 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podceb1ab8c_1a88_49de_a56a_634a6fd85614.slice/crio-03fdcea5539ffe6a7e9cb850da97d52bc92e9558ce7909b12fe805f7ab3cb264 WatchSource:0}: Error finding container 03fdcea5539ffe6a7e9cb850da97d52bc92e9558ce7909b12fe805f7ab3cb264: Status 404 returned error can't find the container with id 03fdcea5539ffe6a7e9cb850da97d52bc92e9558ce7909b12fe805f7ab3cb264 Apr 22 18:21:14.099007 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.098979 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5n6ps" event={"ID":"ceb1ab8c-1a88-49de-a56a-634a6fd85614","Type":"ContainerStarted","Data":"03fdcea5539ffe6a7e9cb850da97d52bc92e9558ce7909b12fe805f7ab3cb264"} Apr 22 18:21:14.100358 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.100322 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/0.log" Apr 22 18:21:14.100419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.100354 2561 generic.go:358] "Generic (PLEG): container finished" podID="d9adb985-4468-4c75-8d62-db92f367d26a" containerID="4aae600966dcdf1a7780d7840fd4c28351173b9839f697e21cb8fefb8dcb64a6" exitCode=255 Apr 22 18:21:14.100460 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.100429 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" 
event={"ID":"d9adb985-4468-4c75-8d62-db92f367d26a","Type":"ContainerDied","Data":"4aae600966dcdf1a7780d7840fd4c28351173b9839f697e21cb8fefb8dcb64a6"} Apr 22 18:21:14.100715 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.100695 2561 scope.go:117] "RemoveContainer" containerID="4aae600966dcdf1a7780d7840fd4c28351173b9839f697e21cb8fefb8dcb64a6" Apr 22 18:21:14.635044 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:14.635019 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mnvl5_543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291/dns-node-resolver/0.log" Apr 22 18:21:15.105018 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.104935 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" event={"ID":"c90628b5-0064-4541-8207-f01d20c9be00","Type":"ContainerStarted","Data":"f428db7b62418388324d764fb4d5382e1bd99da4e4eab4d5689e647493fa9253"} Apr 22 18:21:15.105018 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.104972 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" event={"ID":"c90628b5-0064-4541-8207-f01d20c9be00","Type":"ContainerStarted","Data":"3ac87adb8b5b5cbe7276f1a3955ca74dbd2b7b8374b89e47c799a11ac4aadbd8"} Apr 22 18:21:15.106352 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.106322 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-5n6ps" event={"ID":"ceb1ab8c-1a88-49de-a56a-634a6fd85614","Type":"ContainerStarted","Data":"9f345b055d72e7e52b1a1d85844b5ede54af1a2d8df94ca262902f5fd8033973"} Apr 22 18:21:15.107655 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.107635 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/1.log" Apr 22 18:21:15.108010 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.107993 
2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/0.log" Apr 22 18:21:15.108065 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.108033 2561 generic.go:358] "Generic (PLEG): container finished" podID="d9adb985-4468-4c75-8d62-db92f367d26a" containerID="f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f" exitCode=255 Apr 22 18:21:15.108098 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.108076 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" event={"ID":"d9adb985-4468-4c75-8d62-db92f367d26a","Type":"ContainerDied","Data":"f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f"} Apr 22 18:21:15.108131 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.108104 2561 scope.go:117] "RemoveContainer" containerID="4aae600966dcdf1a7780d7840fd4c28351173b9839f697e21cb8fefb8dcb64a6" Apr 22 18:21:15.108294 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.108276 2561 scope.go:117] "RemoveContainer" containerID="f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f" Apr 22 18:21:15.108448 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:15.108431 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fk8pv_openshift-console-operator(d9adb985-4468-4c75-8d62-db92f367d26a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" podUID="d9adb985-4468-4c75-8d62-db92f367d26a" Apr 22 18:21:15.128621 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.128562 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-gszhh" podStartSLOduration=1.800487578 podStartE2EDuration="4.128545341s" 
podCreationTimestamp="2026-04-22 18:21:11 +0000 UTC" firstStartedPulling="2026-04-22 18:21:12.072416819 +0000 UTC m=+43.689057925" lastFinishedPulling="2026-04-22 18:21:14.400474579 +0000 UTC m=+46.017115688" observedRunningTime="2026-04-22 18:21:15.126894321 +0000 UTC m=+46.743535451" watchObservedRunningTime="2026-04-22 18:21:15.128545341 +0000 UTC m=+46.745186466" Apr 22 18:21:15.159976 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.159928 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-5n6ps" podStartSLOduration=2.159911804 podStartE2EDuration="2.159911804s" podCreationTimestamp="2026-04-22 18:21:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:15.159502484 +0000 UTC m=+46.776143622" watchObservedRunningTime="2026-04-22 18:21:15.159911804 +0000 UTC m=+46.776552969" Apr 22 18:21:15.815428 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.815398 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wxxx2_c4067f04-ceb3-492b-a98c-80b9c869cc01/node-ca/0.log" Apr 22 18:21:15.870993 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.870954 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:21:15.873271 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:15.873248 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3c1d8261-0db3-4d2b-808a-e6bfde776154-original-pull-secret\") pod \"global-pull-secret-syncer-crcs7\" (UID: \"3c1d8261-0db3-4d2b-808a-e6bfde776154\") " 
pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:21:16.112069 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.112001 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/1.log" Apr 22 18:21:16.112433 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.112370 2561 scope.go:117] "RemoveContainer" containerID="f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f" Apr 22 18:21:16.112618 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:16.112597 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fk8pv_openshift-console-operator(d9adb985-4468-4c75-8d62-db92f367d26a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" podUID="d9adb985-4468-4c75-8d62-db92f367d26a" Apr 22 18:21:16.113761 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.113749 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-crcs7" Apr 22 18:21:16.261202 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.261163 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-crcs7"] Apr 22 18:21:16.420470 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.420435 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gszhh_c90628b5-0064-4541-8207-f01d20c9be00/migrator/0.log" Apr 22 18:21:16.615271 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.615133 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gszhh_c90628b5-0064-4541-8207-f01d20c9be00/graceful-termination/0.log" Apr 22 18:21:16.835585 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:16.835449 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sbpmg_f6c11aaf-3f61-4ced-8377-07f284493875/kube-storage-version-migrator-operator/0.log" Apr 22 18:21:17.119580 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.117845 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-crcs7" event={"ID":"3c1d8261-0db3-4d2b-808a-e6bfde776154","Type":"ContainerStarted","Data":"a159dcd27eaf960754b24b95bf649bfd8babeb79100b56d8da581c451705a084"} Apr 22 18:21:17.185997 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.185942 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: 
I0422 18:21:17.186019 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.186044 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.186079 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186230 2561 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186257 2561 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186281 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:21:33.186259855 +0000 UTC m=+64.802900963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : configmap references non-existent config key: service-ca.crt Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186307 2561 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186308 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs podName:4280d58f-b305-45fb-a79c-389e20a9cd66 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.186298424 +0000 UTC m=+64.802939544 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs") pod "router-default-65fc44b94d-k2qvj" (UID: "4280d58f-b305-45fb-a79c-389e20a9cd66") : secret "router-metrics-certs-default" not found Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186360 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls podName:ee93315f-9c9b-4049-924b-51b8b2c9e9dc nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.186349023 +0000 UTC m=+64.802990139 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-g9jhc" (UID: "ee93315f-9c9b-4049-924b-51b8b2c9e9dc") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:21:17.186419 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.186372 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls podName:10a6275f-3d55-41df-9ed8-7ff7d65b52cf nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.18636553 +0000 UTC m=+64.803006636 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pcct7" (UID: "10a6275f-3d55-41df-9ed8-7ff7d65b52cf") : secret "samples-operator-tls" not found Apr 22 18:21:17.286843 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.286805 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:17.286843 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.286849 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:17.287111 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:17.286962 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" 
(UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:17.287111 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.286995 2561 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:21:17.287111 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.287060 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls podName:44a332db-f3dc-4f80-a249-8ff0d0faa3ae nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.287041178 +0000 UTC m=+64.903682290 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls") pod "dns-default-294lm" (UID: "44a332db-f3dc-4f80-a249-8ff0d0faa3ae") : secret "dns-default-metrics-tls" not found Apr 22 18:21:17.287111 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.287080 2561 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:21:17.287374 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.287135 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert podName:ed95f2bf-02e7-48fb-a9ea-047c98cd7939 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.287120121 +0000 UTC m=+64.903761226 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-vlb5l" (UID: "ed95f2bf-02e7-48fb-a9ea-047c98cd7939") : secret "networking-console-plugin-cert" not found Apr 22 18:21:17.287374 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.287081 2561 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:21:17.287374 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:17.287176 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert podName:319485bf-3dcb-4995-b853-56ed38442a76 nodeName:}" failed. No retries permitted until 2026-04-22 18:21:33.287166074 +0000 UTC m=+64.903807181 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert") pod "ingress-canary-66ht9" (UID: "319485bf-3dcb-4995-b853-56ed38442a76") : secret "canary-serving-cert" not found Apr 22 18:21:18.195700 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:18.195665 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:18.196160 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:18.195819 2561 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:21:18.196160 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:18.195842 2561 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-65444dcf5-96pv2: secret "image-registry-tls" not found Apr 22 18:21:18.196160 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:18.195909 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls podName:a9564463-99d6-488c-ac26-bee01a2bbb0d nodeName:}" failed. No retries permitted until 2026-04-22 18:21:34.195887677 +0000 UTC m=+65.812528797 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls") pod "image-registry-65444dcf5-96pv2" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d") : secret "image-registry-tls" not found Apr 22 18:21:21.131564 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:21.131501 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-crcs7" event={"ID":"3c1d8261-0db3-4d2b-808a-e6bfde776154","Type":"ContainerStarted","Data":"26d15412d8baa34bb8906b86440038e94c018e0e1f1dc2ae6b718012a7fec03d"} Apr 22 18:21:21.149677 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:21.149625 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-crcs7" podStartSLOduration=34.028152958 podStartE2EDuration="38.149610517s" podCreationTimestamp="2026-04-22 18:20:43 +0000 UTC" firstStartedPulling="2026-04-22 18:21:16.271959978 +0000 UTC m=+47.888601098" lastFinishedPulling="2026-04-22 18:21:20.393417547 +0000 UTC m=+52.010058657" observedRunningTime="2026-04-22 18:21:21.14870684 +0000 UTC m=+52.765347971" watchObservedRunningTime="2026-04-22 18:21:21.149610517 +0000 UTC m=+52.766251658" Apr 22 18:21:23.421174 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:23.421137 2561 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:23.421174 
ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:23.421176 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:23.421599 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:23.421498 2561 scope.go:117] "RemoveContainer" containerID="f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f" Apr 22 18:21:23.421747 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:23.421728 2561 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-fk8pv_openshift-console-operator(d9adb985-4468-4c75-8d62-db92f367d26a)\"" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" podUID="d9adb985-4468-4c75-8d62-db92f367d26a" Apr 22 18:21:27.040253 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:27.040222 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tp5cv" Apr 22 18:21:33.233829 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.233780 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:33.234294 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.233960 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 
22 18:21:33.234294 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.234014 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:33.234294 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.234080 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:33.234720 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.234697 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4280d58f-b305-45fb-a79c-389e20a9cd66-service-ca-bundle\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:33.236341 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.236315 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee93315f-9c9b-4049-924b-51b8b2c9e9dc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-g9jhc\" (UID: \"ee93315f-9c9b-4049-924b-51b8b2c9e9dc\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:33.236445 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.236371 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/10a6275f-3d55-41df-9ed8-7ff7d65b52cf-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pcct7\" (UID: \"10a6275f-3d55-41df-9ed8-7ff7d65b52cf\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:33.236487 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.236457 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4280d58f-b305-45fb-a79c-389e20a9cd66-metrics-certs\") pod \"router-default-65fc44b94d-k2qvj\" (UID: \"4280d58f-b305-45fb-a79c-389e20a9cd66\") " pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:33.335024 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.334973 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:33.335225 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.335073 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:33.335225 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.335093 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:33.337534 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.337483 
2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/319485bf-3dcb-4995-b853-56ed38442a76-cert\") pod \"ingress-canary-66ht9\" (UID: \"319485bf-3dcb-4995-b853-56ed38442a76\") " pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:33.337534 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.337500 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/44a332db-f3dc-4f80-a249-8ff0d0faa3ae-metrics-tls\") pod \"dns-default-294lm\" (UID: \"44a332db-f3dc-4f80-a249-8ff0d0faa3ae\") " pod="openshift-dns/dns-default-294lm" Apr 22 18:21:33.337534 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.337483 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/ed95f2bf-02e7-48fb-a9ea-047c98cd7939-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-vlb5l\" (UID: \"ed95f2bf-02e7-48fb-a9ea-047c98cd7939\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:33.438959 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.438927 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-r7d2d\"" Apr 22 18:21:33.446480 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.446456 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" Apr 22 18:21:33.471082 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.471058 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-sg8lg\"" Apr 22 18:21:33.478269 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.478241 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" Apr 22 18:21:33.478559 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.478542 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-czd76\"" Apr 22 18:21:33.486213 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.486086 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:33.502194 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.502170 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-9mqk2\"" Apr 22 18:21:33.507669 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.507448 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-dwl6t\"" Apr 22 18:21:33.509789 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.509766 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" Apr 22 18:21:33.516394 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.516224 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-66ht9" Apr 22 18:21:33.546634 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.546279 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-69x95\"" Apr 22 18:21:33.552985 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.552531 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-294lm" Apr 22 18:21:33.593165 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.593105 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7"] Apr 22 18:21:33.675694 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.675649 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc"] Apr 22 18:21:33.677040 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:33.677012 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee93315f_9c9b_4049_924b_51b8b2c9e9dc.slice/crio-81dd1971cda9531ddbd523585aef537d2b2229a87570784f1f95b807c7dd99f8 WatchSource:0}: Error finding container 81dd1971cda9531ddbd523585aef537d2b2229a87570784f1f95b807c7dd99f8: Status 404 returned error can't find the container with id 81dd1971cda9531ddbd523585aef537d2b2229a87570784f1f95b807c7dd99f8 Apr 22 18:21:33.718704 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.718633 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65fc44b94d-k2qvj"] Apr 22 18:21:33.747185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.747163 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-294lm"] Apr 22 18:21:33.751613 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:33.751579 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a332db_f3dc_4f80_a249_8ff0d0faa3ae.slice/crio-7f543d7373bc7bc8ae2240223a84c7649709ade21b0faf7a18cb6d009cfab34f WatchSource:0}: Error finding container 7f543d7373bc7bc8ae2240223a84c7649709ade21b0faf7a18cb6d009cfab34f: Status 404 returned error can't find the container with id 7f543d7373bc7bc8ae2240223a84c7649709ade21b0faf7a18cb6d009cfab34f Apr 22 18:21:33.827348 
ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.827316 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8"] Apr 22 18:21:33.830344 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.830321 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:33.833499 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.833472 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 18:21:33.834376 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.834355 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 18:21:33.834476 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.834393 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 18:21:33.834476 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.834374 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-csfnc\"" Apr 22 18:21:33.834604 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.834439 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 18:21:33.834867 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.834832 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb"] Apr 22 18:21:33.838625 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.838605 2561 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z"] Apr 22 18:21:33.838754 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.838739 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:33.841925 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.841903 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 22 18:21:33.843065 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.843048 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.845768 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.845750 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 22 18:21:33.845877 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.845811 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 22 18:21:33.845877 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.845858 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 22 18:21:33.845967 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.845754 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 22 18:21:33.848553 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.848506 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8"] Apr 22 18:21:33.855220 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.855200 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z"] Apr 22 18:21:33.868777 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.868754 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb"] Apr 22 18:21:33.937733 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.937708 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l"] Apr 22 18:21:33.938943 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.938910 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-66ht9"] Apr 22 18:21:33.939356 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939331 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8tfv\" (UniqueName: \"kubernetes.io/projected/3718780d-3335-42fe-85e7-78e4602c2d48-kube-api-access-s8tfv\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:33.939356 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:33.939337 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded95f2bf_02e7_48fb_a9ea_047c98cd7939.slice/crio-c4b64b85564f57eecd33112cae4d967853b3014398bdcb3b0e9745b060e508b3 WatchSource:0}: Error finding container c4b64b85564f57eecd33112cae4d967853b3014398bdcb3b0e9745b060e508b3: Status 404 returned error can't find the container with id c4b64b85564f57eecd33112cae4d967853b3014398bdcb3b0e9745b060e508b3 Apr 22 
18:21:33.939479 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939396 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8418f0af-1749-422d-b00d-822602aa1396-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:33.939479 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939429 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-268d2\" (UniqueName: \"kubernetes.io/projected/8418f0af-1749-422d-b00d-822602aa1396-kube-api-access-268d2\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:33.939479 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939460 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7jd7\" (UniqueName: \"kubernetes.io/projected/eb07bc52-463d-4718-a745-ef6f79d23730-kube-api-access-b7jd7\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.939662 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939481 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.939662 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:33.939507 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.939662 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939608 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.939662 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939653 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3718780d-3335-42fe-85e7-78e4602c2d48-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:33.939800 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939717 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eb07bc52-463d-4718-a745-ef6f79d23730-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.939847 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939780 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3718780d-3335-42fe-85e7-78e4602c2d48-tmp\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:33.939882 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.939844 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:33.940202 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:33.940180 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod319485bf_3dcb_4995_b853_56ed38442a76.slice/crio-bbbbe698da5a865249ab2b9ca07a58722edfe1236e9d807723826a83d4816f1f WatchSource:0}: Error finding container bbbbe698da5a865249ab2b9ca07a58722edfe1236e9d807723826a83d4816f1f: Status 404 returned error can't find the container with id bbbbe698da5a865249ab2b9ca07a58722edfe1236e9d807723826a83d4816f1f Apr 22 18:21:33.951622 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.951598 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tlr4j"] Apr 22 18:21:33.981318 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.981290 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tlr4j"] Apr 22 18:21:33.981463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.981447 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:33.984331 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.984303 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:21:33.984463 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.984347 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:21:33.984606 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:33.984591 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-592td\"" Apr 22 18:21:34.041241 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041212 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5cd2fc57-e359-4295-9867-87062abcf7b9-data-volume\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.041385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041257 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.041385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041278 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tlr4j\" (UID: 
\"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.041385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041332 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3718780d-3335-42fe-85e7-78e4602c2d48-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.041385 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041359 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492jb\" (UniqueName: \"kubernetes.io/projected/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-api-access-492jb\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.041557 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041458 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5cd2fc57-e359-4295-9867-87062abcf7b9-crio-socket\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.041557 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041498 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eb07bc52-463d-4718-a745-ef6f79d23730-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.041648 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041561 
2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3718780d-3335-42fe-85e7-78e4602c2d48-tmp\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.041648 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041598 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.041751 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041657 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s8tfv\" (UniqueName: \"kubernetes.io/projected/3718780d-3335-42fe-85e7-78e4602c2d48-kube-api-access-s8tfv\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.041751 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041707 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8418f0af-1749-422d-b00d-822602aa1396-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:34.041751 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041732 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-268d2\" (UniqueName: 
\"kubernetes.io/projected/8418f0af-1749-422d-b00d-822602aa1396-kube-api-access-268d2\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:34.041904 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041763 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7jd7\" (UniqueName: \"kubernetes.io/projected/eb07bc52-463d-4718-a745-ef6f79d23730-kube-api-access-b7jd7\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.041904 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041790 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5cd2fc57-e359-4295-9867-87062abcf7b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.041904 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041826 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.041904 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041860 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: 
\"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.042093 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.041936 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3718780d-3335-42fe-85e7-78e4602c2d48-tmp\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.042324 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.042297 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/eb07bc52-463d-4718-a745-ef6f79d23730-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.044271 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044224 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.044400 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044383 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-ca\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.044439 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044400 2561 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.044490 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044470 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/eb07bc52-463d-4718-a745-ef6f79d23730-hub\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.044592 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044481 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3718780d-3335-42fe-85e7-78e4602c2d48-klusterlet-config\") pod \"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.044689 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.044651 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8418f0af-1749-422d-b00d-822602aa1396-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:34.052041 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.052022 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8tfv\" (UniqueName: \"kubernetes.io/projected/3718780d-3335-42fe-85e7-78e4602c2d48-kube-api-access-s8tfv\") pod 
\"klusterlet-addon-workmgr-7cf6f479dc-fhhwb\" (UID: \"3718780d-3335-42fe-85e7-78e4602c2d48\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.052353 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.052338 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-268d2\" (UniqueName: \"kubernetes.io/projected/8418f0af-1749-422d-b00d-822602aa1396-kube-api-access-268d2\") pod \"managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8\" (UID: \"8418f0af-1749-422d-b00d-822602aa1396\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:34.054864 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.054843 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7jd7\" (UniqueName: \"kubernetes.io/projected/eb07bc52-463d-4718-a745-ef6f79d23730-kube-api-access-b7jd7\") pod \"cluster-proxy-proxy-agent-56c4d585df-k5r6z\" (UID: \"eb07bc52-463d-4718-a745-ef6f79d23730\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.142243 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142207 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5cd2fc57-e359-4295-9867-87062abcf7b9-data-volume\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.142243 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142247 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 
18:21:34.142446 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142279 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-492jb\" (UniqueName: \"kubernetes.io/projected/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-api-access-492jb\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.142446 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142309 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5cd2fc57-e359-4295-9867-87062abcf7b9-crio-socket\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.142446 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142422 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5cd2fc57-e359-4295-9867-87062abcf7b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.142577 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142479 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5cd2fc57-e359-4295-9867-87062abcf7b9-crio-socket\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.142881 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.142861 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.143345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.143327 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5cd2fc57-e359-4295-9867-87062abcf7b9-data-volume\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.144657 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.144639 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5cd2fc57-e359-4295-9867-87062abcf7b9-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.149638 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.149618 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" Apr 22 18:21:34.156345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.156326 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-492jb\" (UniqueName: \"kubernetes.io/projected/5cd2fc57-e359-4295-9867-87062abcf7b9-kube-api-access-492jb\") pod \"insights-runtime-extractor-tlr4j\" (UID: \"5cd2fc57-e359-4295-9867-87062abcf7b9\") " pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.159346 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.159318 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:34.164355 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.164337 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" Apr 22 18:21:34.167493 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.167466 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294lm" event={"ID":"44a332db-f3dc-4f80-a249-8ff0d0faa3ae","Type":"ContainerStarted","Data":"7f543d7373bc7bc8ae2240223a84c7649709ade21b0faf7a18cb6d009cfab34f"} Apr 22 18:21:34.168794 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.168762 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" event={"ID":"ed95f2bf-02e7-48fb-a9ea-047c98cd7939","Type":"ContainerStarted","Data":"c4b64b85564f57eecd33112cae4d967853b3014398bdcb3b0e9745b060e508b3"} Apr 22 18:21:34.170419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.170368 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" event={"ID":"4280d58f-b305-45fb-a79c-389e20a9cd66","Type":"ContainerStarted","Data":"822c95ef14c1dc8a92096a9d93d095cb3818351b51b1c9d0fcea49c2ca6e207f"} Apr 22 18:21:34.170419 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.170399 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" event={"ID":"4280d58f-b305-45fb-a79c-389e20a9cd66","Type":"ContainerStarted","Data":"10c63edd00df45d49ec24a88ffbcff496f1e13aea07a23b948b54b99ea73ecd2"} Apr 22 18:21:34.171765 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.171741 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66ht9" 
event={"ID":"319485bf-3dcb-4995-b853-56ed38442a76","Type":"ContainerStarted","Data":"bbbbe698da5a865249ab2b9ca07a58722edfe1236e9d807723826a83d4816f1f"} Apr 22 18:21:34.173104 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.173073 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" event={"ID":"ee93315f-9c9b-4049-924b-51b8b2c9e9dc","Type":"ContainerStarted","Data":"81dd1971cda9531ddbd523585aef537d2b2229a87570784f1f95b807c7dd99f8"} Apr 22 18:21:34.174329 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.174292 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" event={"ID":"10a6275f-3d55-41df-9ed8-7ff7d65b52cf","Type":"ContainerStarted","Data":"6f93f526c8574f2a855efa54214677d0228c44957e5cdd1e182f4bdf573fab3e"} Apr 22 18:21:34.199173 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.198333 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" podStartSLOduration=54.198316601 podStartE2EDuration="54.198316601s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:34.197465995 +0000 UTC m=+65.814107123" watchObservedRunningTime="2026-04-22 18:21:34.198316601 +0000 UTC m=+65.814957723" Apr 22 18:21:34.244162 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.244129 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:34.251284 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.251218 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"image-registry-65444dcf5-96pv2\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:34.304651 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.304565 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tlr4j" Apr 22 18:21:34.321610 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.321556 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8"] Apr 22 18:21:34.324985 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:34.324954 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8418f0af_1749_422d_b00d_822602aa1396.slice/crio-b3d818278605829d8b3d6ec40117dc620f213a68d210c2d52d0fcc28a1b4888f WatchSource:0}: Error finding container b3d818278605829d8b3d6ec40117dc620f213a68d210c2d52d0fcc28a1b4888f: Status 404 returned error can't find the container with id b3d818278605829d8b3d6ec40117dc620f213a68d210c2d52d0fcc28a1b4888f Apr 22 18:21:34.330996 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.330973 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-4bvf4\"" Apr 22 18:21:34.339128 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.339060 2561 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:34.343192 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.343103 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z"] Apr 22 18:21:34.347109 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.347085 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb"] Apr 22 18:21:34.348166 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:34.348139 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb07bc52_463d_4718_a745_ef6f79d23730.slice/crio-c7c4e4fd31809c94f356f0e7077818f38aab7b1e764de5184573bbd274d56855 WatchSource:0}: Error finding container c7c4e4fd31809c94f356f0e7077818f38aab7b1e764de5184573bbd274d56855: Status 404 returned error can't find the container with id c7c4e4fd31809c94f356f0e7077818f38aab7b1e764de5184573bbd274d56855 Apr 22 18:21:34.351151 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:34.351121 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3718780d_3335_42fe_85e7_78e4602c2d48.slice/crio-e15f813699d8aac242b7d8540f5dbc39c02a13c69d74db9a5e91e6284152b6ab WatchSource:0}: Error finding container e15f813699d8aac242b7d8540f5dbc39c02a13c69d74db9a5e91e6284152b6ab: Status 404 returned error can't find the container with id e15f813699d8aac242b7d8540f5dbc39c02a13c69d74db9a5e91e6284152b6ab Apr 22 18:21:34.459791 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.459750 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tlr4j"] Apr 22 18:21:34.487903 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.487206 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:34.490114 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.489881 2561 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:34.512670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.512464 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"] Apr 22 18:21:34.521201 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:34.521171 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9564463_99d6_488c_ac26_bee01a2bbb0d.slice/crio-d458accbe6ee98c58cae87a1822973ee5088d276e918dac45305c6a4a4053db2 WatchSource:0}: Error finding container d458accbe6ee98c58cae87a1822973ee5088d276e918dac45305c6a4a4053db2: Status 404 returned error can't find the container with id d458accbe6ee98c58cae87a1822973ee5088d276e918dac45305c6a4a4053db2 Apr 22 18:21:34.647094 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.647054 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:21:34.650115 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.650086 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:21:34.665746 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.665716 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c854aae6-d913-46c5-9cec-ae4b5f6e8ff7-metrics-certs\") pod \"network-metrics-daemon-44prk\" (UID: \"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7\") " 
pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:21:34.725435 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.725403 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-bn55n\"" Apr 22 18:21:34.733131 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.733103 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-44prk" Apr 22 18:21:34.911180 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:34.911121 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-44prk"] Apr 22 18:21:34.912031 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:34.911990 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc854aae6_d913_46c5_9cec_ae4b5f6e8ff7.slice/crio-ac57453e10df44a42587c479b132f188ae9b169113e3b2ff383eb05f6e7ed5ce WatchSource:0}: Error finding container ac57453e10df44a42587c479b132f188ae9b169113e3b2ff383eb05f6e7ed5ce: Status 404 returned error can't find the container with id ac57453e10df44a42587c479b132f188ae9b169113e3b2ff383eb05f6e7ed5ce Apr 22 18:21:35.182914 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.182843 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" event={"ID":"3718780d-3335-42fe-85e7-78e4602c2d48","Type":"ContainerStarted","Data":"e15f813699d8aac242b7d8540f5dbc39c02a13c69d74db9a5e91e6284152b6ab"} Apr 22 18:21:35.186863 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.186782 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlr4j" event={"ID":"5cd2fc57-e359-4295-9867-87062abcf7b9","Type":"ContainerStarted","Data":"9b032c28ff45273df5f2e889181e6ead07d46879cf4e8a62351125988d5bcd3c"} Apr 22 18:21:35.186863 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:35.186822 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlr4j" event={"ID":"5cd2fc57-e359-4295-9867-87062abcf7b9","Type":"ContainerStarted","Data":"bef7e733d92127ddbd335c69461a1a1729ff3fe091349f6e15c9a4b0a8defcde"} Apr 22 18:21:35.188738 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.188699 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44prk" event={"ID":"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7","Type":"ContainerStarted","Data":"ac57453e10df44a42587c479b132f188ae9b169113e3b2ff383eb05f6e7ed5ce"} Apr 22 18:21:35.192597 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.191666 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" event={"ID":"a9564463-99d6-488c-ac26-bee01a2bbb0d","Type":"ContainerStarted","Data":"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad"} Apr 22 18:21:35.192597 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.191698 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" event={"ID":"a9564463-99d6-488c-ac26-bee01a2bbb0d","Type":"ContainerStarted","Data":"d458accbe6ee98c58cae87a1822973ee5088d276e918dac45305c6a4a4053db2"} Apr 22 18:21:35.192597 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.192560 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:35.197223 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.197194 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" event={"ID":"eb07bc52-463d-4718-a745-ef6f79d23730","Type":"ContainerStarted","Data":"c7c4e4fd31809c94f356f0e7077818f38aab7b1e764de5184573bbd274d56855"} Apr 22 18:21:35.200430 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:21:35.200366 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" event={"ID":"8418f0af-1749-422d-b00d-822602aa1396","Type":"ContainerStarted","Data":"b3d818278605829d8b3d6ec40117dc620f213a68d210c2d52d0fcc28a1b4888f"} Apr 22 18:21:35.200430 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.200404 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:35.202145 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.202109 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-65fc44b94d-k2qvj" Apr 22 18:21:35.237017 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:35.236960 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" podStartSLOduration=66.236939246 podStartE2EDuration="1m6.236939246s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:21:35.214609489 +0000 UTC m=+66.831250620" watchObservedRunningTime="2026-04-22 18:21:35.236939246 +0000 UTC m=+66.853580377" Apr 22 18:21:37.895427 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:37.895264 2561 scope.go:117] "RemoveContainer" containerID="f3dc82ab6ec2fff4814eb90b2a30988abc0cabe601a83aa45cfe41ac114e2a8f" Apr 22 18:21:41.085169 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:41.085140 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-sf25h" Apr 22 18:21:43.246143 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.246096 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-66ht9" 
event={"ID":"319485bf-3dcb-4995-b853-56ed38442a76","Type":"ContainerStarted","Data":"cfdb6d264c827901e076338d0e72efda7b19e257699a91e97b67b7c7fc3df28d"} Apr 22 18:21:43.258052 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.258022 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" event={"ID":"ee93315f-9c9b-4049-924b-51b8b2c9e9dc","Type":"ContainerStarted","Data":"9eb0e830fc126e6c3e8ab8d73ba130d0166ff38aff9d567d2f1862313c45fc64"} Apr 22 18:21:43.269700 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.269670 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" event={"ID":"10a6275f-3d55-41df-9ed8-7ff7d65b52cf","Type":"ContainerStarted","Data":"601561ddfc0453e701265773a8574a90f7c2d13a60a5aff8e4fe59abe84d4774"} Apr 22 18:21:43.269784 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.269712 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" event={"ID":"10a6275f-3d55-41df-9ed8-7ff7d65b52cf","Type":"ContainerStarted","Data":"37fbb73b8617ecd18a74b22f46c9a60b1f56bd93b80592492f511554ad0a80de"} Apr 22 18:21:43.270937 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.270889 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-66ht9" podStartSLOduration=33.509723383 podStartE2EDuration="42.27087363s" podCreationTimestamp="2026-04-22 18:21:01 +0000 UTC" firstStartedPulling="2026-04-22 18:21:33.941922516 +0000 UTC m=+65.558563623" lastFinishedPulling="2026-04-22 18:21:42.703072759 +0000 UTC m=+74.319713870" observedRunningTime="2026-04-22 18:21:43.270793531 +0000 UTC m=+74.887434660" watchObservedRunningTime="2026-04-22 18:21:43.27087363 +0000 UTC m=+74.887514759" Apr 22 18:21:43.272190 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.272162 2561 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" event={"ID":"ed95f2bf-02e7-48fb-a9ea-047c98cd7939","Type":"ContainerStarted","Data":"46da097bed6753c4946269c93487a263c9c3eea96eb1fd717bb1940f817bdfe7"} Apr 22 18:21:43.276279 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.276116 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/1.log" Apr 22 18:21:43.276279 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.276203 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" event={"ID":"d9adb985-4468-4c75-8d62-db92f367d26a","Type":"ContainerStarted","Data":"5a82978f8625fec666a7d4eb1f5e44a0c6a9541b96ab46ba54a5b0d27bf39b44"} Apr 22 18:21:43.284345 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.284316 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" event={"ID":"eb07bc52-463d-4718-a745-ef6f79d23730","Type":"ContainerStarted","Data":"35bcf5e1e7936d45e23d9f07feae62704d01ce2abe8e5f85b1f0cf3a862d1c1e"} Apr 22 18:21:43.286787 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.286764 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" event={"ID":"8418f0af-1749-422d-b00d-822602aa1396","Type":"ContainerStarted","Data":"fbfb9b126a9fd6cf0b8df2ab929fde3d11b5e940f99a740dc1589ba02b7b43dd"} Apr 22 18:21:43.294179 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.294127 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-g9jhc" podStartSLOduration=54.27030984 podStartE2EDuration="1m3.294110032s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" 
firstStartedPulling="2026-04-22 18:21:33.67925719 +0000 UTC m=+65.295898296" lastFinishedPulling="2026-04-22 18:21:42.703057381 +0000 UTC m=+74.319698488" observedRunningTime="2026-04-22 18:21:43.29252925 +0000 UTC m=+74.909170373" watchObservedRunningTime="2026-04-22 18:21:43.294110032 +0000 UTC m=+74.910751161" Apr 22 18:21:43.301676 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.301652 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" event={"ID":"3718780d-3335-42fe-85e7-78e4602c2d48","Type":"ContainerStarted","Data":"546affc1e4454e4fa570970a1801cc709d03f312d1feaba34ce0fd87038c93b7"} Apr 22 18:21:43.302397 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.302381 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:43.305109 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.305076 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" Apr 22 18:21:43.307410 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.307377 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294lm" event={"ID":"44a332db-f3dc-4f80-a249-8ff0d0faa3ae","Type":"ContainerStarted","Data":"91d82b69cb313175ea4d776af9bd64e85187583ba801a23349ca08537e7cfac2"} Apr 22 18:21:43.310779 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.310740 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlr4j" event={"ID":"5cd2fc57-e359-4295-9867-87062abcf7b9","Type":"ContainerStarted","Data":"77d66bc80e184a7b4ecea2e12d8d99e9ac1bea287d3d42899e4d22986e0b7e09"} Apr 22 18:21:43.321218 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.321103 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pcct7" podStartSLOduration=54.294909117 podStartE2EDuration="1m3.321089581s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:33.677179577 +0000 UTC m=+65.293820693" lastFinishedPulling="2026-04-22 18:21:42.703360034 +0000 UTC m=+74.320001157" observedRunningTime="2026-04-22 18:21:43.320796227 +0000 UTC m=+74.937437349" watchObservedRunningTime="2026-04-22 18:21:43.321089581 +0000 UTC m=+74.937730709" Apr 22 18:21:43.352637 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.352529 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" podStartSLOduration=60.034088886 podStartE2EDuration="1m3.352495884s" podCreationTimestamp="2026-04-22 18:20:40 +0000 UTC" firstStartedPulling="2026-04-22 18:21:09.808675845 +0000 UTC m=+41.425316957" lastFinishedPulling="2026-04-22 18:21:13.127082848 +0000 UTC m=+44.743723955" observedRunningTime="2026-04-22 18:21:43.351273122 +0000 UTC m=+74.967914249" watchObservedRunningTime="2026-04-22 18:21:43.352495884 +0000 UTC m=+74.969137014" Apr 22 18:21:43.400666 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.399867 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5d576c7bcf-hzfl8" podStartSLOduration=1.954938018 podStartE2EDuration="10.399848615s" podCreationTimestamp="2026-04-22 18:21:33 +0000 UTC" firstStartedPulling="2026-04-22 18:21:34.327596554 +0000 UTC m=+65.944237666" lastFinishedPulling="2026-04-22 18:21:42.772507154 +0000 UTC m=+74.389148263" observedRunningTime="2026-04-22 18:21:43.395783539 +0000 UTC m=+75.012424670" watchObservedRunningTime="2026-04-22 18:21:43.399848615 +0000 UTC m=+75.016489745" Apr 22 18:21:43.433105 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.432332 2561 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7cf6f479dc-fhhwb" podStartSLOduration=2.0185746 podStartE2EDuration="10.432314382s" podCreationTimestamp="2026-04-22 18:21:33 +0000 UTC" firstStartedPulling="2026-04-22 18:21:34.354480482 +0000 UTC m=+65.971121587" lastFinishedPulling="2026-04-22 18:21:42.768220247 +0000 UTC m=+74.384861369" observedRunningTime="2026-04-22 18:21:43.429826802 +0000 UTC m=+75.046467925" watchObservedRunningTime="2026-04-22 18:21:43.432314382 +0000 UTC m=+75.048955509" Apr 22 18:21:43.450852 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:43.450805 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-vlb5l" podStartSLOduration=51.947492396 podStartE2EDuration="58.450788862s" podCreationTimestamp="2026-04-22 18:20:45 +0000 UTC" firstStartedPulling="2026-04-22 18:21:33.941308589 +0000 UTC m=+65.557949699" lastFinishedPulling="2026-04-22 18:21:40.444605048 +0000 UTC m=+72.061246165" observedRunningTime="2026-04-22 18:21:43.448967681 +0000 UTC m=+75.065608813" watchObservedRunningTime="2026-04-22 18:21:43.450788862 +0000 UTC m=+75.067429989" Apr 22 18:21:44.316696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.316637 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-294lm" event={"ID":"44a332db-f3dc-4f80-a249-8ff0d0faa3ae","Type":"ContainerStarted","Data":"d558f10243640a3847c63ede02b7d27ceecc9a3211912b139f9a43e8a37dcee7"} Apr 22 18:21:44.317136 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.316810 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-294lm" Apr 22 18:21:44.318991 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.318731 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44prk" 
event={"ID":"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7","Type":"ContainerStarted","Data":"14c3c797ab5b89044f06401b77540cf3eb5917df6611ee5d355a9708ce33dc23"} Apr 22 18:21:44.318991 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.318764 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-44prk" event={"ID":"c854aae6-d913-46c5-9cec-ae4b5f6e8ff7","Type":"ContainerStarted","Data":"cd10e2a1e2c9e53c7e7245bc69ec93fd76b0ce77dbb7bacdb64ebb0332e9478f"} Apr 22 18:21:44.340481 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.340425 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-294lm" podStartSLOduration=34.412959994 podStartE2EDuration="43.340406207s" podCreationTimestamp="2026-04-22 18:21:01 +0000 UTC" firstStartedPulling="2026-04-22 18:21:33.753394899 +0000 UTC m=+65.370036020" lastFinishedPulling="2026-04-22 18:21:42.680841126 +0000 UTC m=+74.297482233" observedRunningTime="2026-04-22 18:21:44.33778457 +0000 UTC m=+75.954425723" watchObservedRunningTime="2026-04-22 18:21:44.340406207 +0000 UTC m=+75.957047337" Apr 22 18:21:44.356628 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:44.355878 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-44prk" podStartSLOduration=67.503837708 podStartE2EDuration="1m15.355860162s" podCreationTimestamp="2026-04-22 18:20:29 +0000 UTC" firstStartedPulling="2026-04-22 18:21:34.916253748 +0000 UTC m=+66.532894857" lastFinishedPulling="2026-04-22 18:21:42.7682762 +0000 UTC m=+74.384917311" observedRunningTime="2026-04-22 18:21:44.354646639 +0000 UTC m=+75.971287769" watchObservedRunningTime="2026-04-22 18:21:44.355860162 +0000 UTC m=+75.972501291" Apr 22 18:21:46.327040 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:46.326939 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" 
event={"ID":"eb07bc52-463d-4718-a745-ef6f79d23730","Type":"ContainerStarted","Data":"8bf43f41f3f3d8763e78b249d0c3d7c4fd259201a71253492480c7cf0995d4f4"} Apr 22 18:21:46.327040 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:46.326987 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" event={"ID":"eb07bc52-463d-4718-a745-ef6f79d23730","Type":"ContainerStarted","Data":"e9ecd122b62efa6fc41e58e8e74cf7e95b72ae5c65611ff256bc602b906215bf"} Apr 22 18:21:46.328715 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:46.328687 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlr4j" event={"ID":"5cd2fc57-e359-4295-9867-87062abcf7b9","Type":"ContainerStarted","Data":"2666e35c0507ea1a9a7a9c363f5f264f5f3b4001bda1f8424f78e2ca1bb7f3d0"} Apr 22 18:21:46.347394 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:46.347348 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-56c4d585df-k5r6z" podStartSLOduration=1.777405965 podStartE2EDuration="13.347336235s" podCreationTimestamp="2026-04-22 18:21:33 +0000 UTC" firstStartedPulling="2026-04-22 18:21:34.35326859 +0000 UTC m=+65.969909713" lastFinishedPulling="2026-04-22 18:21:45.923198871 +0000 UTC m=+77.539839983" observedRunningTime="2026-04-22 18:21:46.345610551 +0000 UTC m=+77.962251707" watchObservedRunningTime="2026-04-22 18:21:46.347336235 +0000 UTC m=+77.963977364" Apr 22 18:21:46.366087 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:46.366036 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tlr4j" podStartSLOduration=2.348743952 podStartE2EDuration="13.36602316s" podCreationTimestamp="2026-04-22 18:21:33 +0000 UTC" firstStartedPulling="2026-04-22 18:21:34.551985861 +0000 UTC m=+66.168626974" lastFinishedPulling="2026-04-22 
18:21:45.569265076 +0000 UTC m=+77.185906182" observedRunningTime="2026-04-22 18:21:46.365381753 +0000 UTC m=+77.982022882" watchObservedRunningTime="2026-04-22 18:21:46.36602316 +0000 UTC m=+77.982664287" Apr 22 18:21:52.108129 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.108091 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gn972"] Apr 22 18:21:52.113455 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.113429 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.116667 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.116339 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:21:52.116667 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.116345 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:21:52.116667 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.116648 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:21:52.117938 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.117911 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ghp22\"" Apr 22 18:21:52.118246 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.118224 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:21:52.213596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213546 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-wtmp\") pod 
\"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213596 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213599 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-metrics-client-ca\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213632 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213675 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-textfile\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213769 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213815 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213795 2561 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr6zn\" (UniqueName: \"kubernetes.io/projected/01c89f04-9e92-478e-92f8-ccde40c93b4e-kube-api-access-jr6zn\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213826 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-sys\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213909 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-root\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.213956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.213940 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314363 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314326 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 
18:21:52.314363 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314362 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr6zn\" (UniqueName: \"kubernetes.io/projected/01c89f04-9e92-478e-92f8-ccde40c93b4e-kube-api-access-jr6zn\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314382 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-sys\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314432 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-sys\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:52.314489 2561 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314499 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-root\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314561 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314570 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-root\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:21:52.314599 2561 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls podName:01c89f04-9e92-478e-92f8-ccde40c93b4e nodeName:}" failed. No retries permitted until 2026-04-22 18:21:52.814576942 +0000 UTC m=+84.431218061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls") pod "node-exporter-gn972" (UID: "01c89f04-9e92-478e-92f8-ccde40c93b4e") : secret "node-exporter-tls" not found Apr 22 18:21:52.314670 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314670 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-wtmp\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315054 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314707 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-metrics-client-ca\") pod \"node-exporter-gn972\" (UID: 
\"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315054 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314779 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315054 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314832 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-textfile\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315054 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.314851 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-wtmp\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315239 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.315124 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-textfile\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315239 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.315182 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-metrics-client-ca\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.315239 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.315222 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-accelerators-collector-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.316994 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.316973 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.324794 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.324775 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr6zn\" (UniqueName: \"kubernetes.io/projected/01c89f04-9e92-478e-92f8-ccde40c93b4e-kube-api-access-jr6zn\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.819136 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.819091 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:52.821412 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:52.821389 2561 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/01c89f04-9e92-478e-92f8-ccde40c93b4e-node-exporter-tls\") pod \"node-exporter-gn972\" (UID: \"01c89f04-9e92-478e-92f8-ccde40c93b4e\") " pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:53.024944 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:53.024910 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gn972" Apr 22 18:21:53.036975 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:21:53.036940 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01c89f04_9e92_478e_92f8_ccde40c93b4e.slice/crio-bd298ef0722af0ed99c0602e63217d62c48f0efbc3096dbf88c8549d4cda2504 WatchSource:0}: Error finding container bd298ef0722af0ed99c0602e63217d62c48f0efbc3096dbf88c8549d4cda2504: Status 404 returned error can't find the container with id bd298ef0722af0ed99c0602e63217d62c48f0efbc3096dbf88c8549d4cda2504 Apr 22 18:21:53.298452 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:53.298418 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:53.307496 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:53.307469 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-fk8pv" Apr 22 18:21:53.351050 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:53.351003 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gn972" event={"ID":"01c89f04-9e92-478e-92f8-ccde40c93b4e","Type":"ContainerStarted","Data":"bd298ef0722af0ed99c0602e63217d62c48f0efbc3096dbf88c8549d4cda2504"} Apr 22 18:21:54.324898 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:54.324865 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-dns/dns-default-294lm" Apr 22 18:21:54.344019 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:54.343984 2561 patch_prober.go:28] interesting pod/image-registry-65444dcf5-96pv2 container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:21:54.344179 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:54.344042 2561 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:21:55.358937 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:55.358901 2561 generic.go:358] "Generic (PLEG): container finished" podID="01c89f04-9e92-478e-92f8-ccde40c93b4e" containerID="8cd688bc62f4152fa721edf333a7c6f0828381415203edf5412a0207b5a2ab10" exitCode=0 Apr 22 18:21:55.359315 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:55.358986 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gn972" event={"ID":"01c89f04-9e92-478e-92f8-ccde40c93b4e","Type":"ContainerDied","Data":"8cd688bc62f4152fa721edf333a7c6f0828381415203edf5412a0207b5a2ab10"} Apr 22 18:21:56.020257 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:56.020217 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"] Apr 22 18:21:56.024307 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:56.024283 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:21:56.363746 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:56.363656 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-gn972" event={"ID":"01c89f04-9e92-478e-92f8-ccde40c93b4e","Type":"ContainerStarted","Data":"80e413be377c9c3cba64cb082f3ac2a00bc87096c317d6c09b2c07d934097a19"} Apr 22 18:21:56.363746 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:56.363690 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gn972" event={"ID":"01c89f04-9e92-478e-92f8-ccde40c93b4e","Type":"ContainerStarted","Data":"f873ba166a8a06a9a54f6515a606fa17dae126046e0e7da117baf46b22020948"} Apr 22 18:21:56.383391 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:21:56.383324 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gn972" podStartSLOduration=2.975093603 podStartE2EDuration="4.383309002s" podCreationTimestamp="2026-04-22 18:21:52 +0000 UTC" firstStartedPulling="2026-04-22 18:21:53.038843589 +0000 UTC m=+84.655484702" lastFinishedPulling="2026-04-22 18:21:54.447058985 +0000 UTC m=+86.063700101" observedRunningTime="2026-04-22 18:21:56.382384071 +0000 UTC m=+87.999025189" watchObservedRunningTime="2026-04-22 18:21:56.383309002 +0000 UTC m=+87.999950131" Apr 22 18:22:21.039599 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.039556 2561 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerName="registry" containerID="cri-o://95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad" gracePeriod=30 Apr 22 18:22:21.284561 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.284534 2561 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:22:21.329756 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329672 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.329756 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329708 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.329756 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329736 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.330029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329768 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbzl2\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.330029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329794 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 
18:22:21.330029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329818 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.330029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329879 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.330029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.329915 2561 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") pod \"a9564463-99d6-488c-ac26-bee01a2bbb0d\" (UID: \"a9564463-99d6-488c-ac26-bee01a2bbb0d\") " Apr 22 18:22:21.330621 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.330268 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:22:21.330713 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.330608 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:22:21.332786 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.332732 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:22:21.332786 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.332749 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2" (OuterVolumeSpecName: "kube-api-access-fbzl2") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "kube-api-access-fbzl2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:22:21.332958 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.332799 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:22:21.332958 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.332899 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:22:21.333117 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.333098 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:22:21.338470 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.338442 2561 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a9564463-99d6-488c-ac26-bee01a2bbb0d" (UID: "a9564463-99d6-488c-ac26-bee01a2bbb0d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:22:21.431367 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431325 2561 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-trusted-ca\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431367 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431362 2561 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fbzl2\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-kube-api-access-fbzl2\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431367 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431377 2561 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-bound-sa-token\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431656 ip-10-0-143-95 
kubenswrapper[2561]: I0422 18:22:21.431389 2561 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-certificates\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431656 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431404 2561 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-image-registry-private-configuration\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431656 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431417 2561 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a9564463-99d6-488c-ac26-bee01a2bbb0d-registry-tls\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431656 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431430 2561 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a9564463-99d6-488c-ac26-bee01a2bbb0d-installation-pull-secrets\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.431656 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.431442 2561 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a9564463-99d6-488c-ac26-bee01a2bbb0d-ca-trust-extracted\") on node \"ip-10-0-143-95.ec2.internal\" DevicePath \"\"" Apr 22 18:22:21.440601 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.440567 2561 generic.go:358] "Generic (PLEG): container finished" podID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerID="95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad" exitCode=0 Apr 22 18:22:21.440721 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.440620 2561 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" Apr 22 18:22:21.440721 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.440629 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" event={"ID":"a9564463-99d6-488c-ac26-bee01a2bbb0d","Type":"ContainerDied","Data":"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad"} Apr 22 18:22:21.440721 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.440656 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-65444dcf5-96pv2" event={"ID":"a9564463-99d6-488c-ac26-bee01a2bbb0d","Type":"ContainerDied","Data":"d458accbe6ee98c58cae87a1822973ee5088d276e918dac45305c6a4a4053db2"} Apr 22 18:22:21.440721 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.440672 2561 scope.go:117] "RemoveContainer" containerID="95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad" Apr 22 18:22:21.449234 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.449213 2561 scope.go:117] "RemoveContainer" containerID="95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad" Apr 22 18:22:21.449494 ip-10-0-143-95 kubenswrapper[2561]: E0422 18:22:21.449471 2561 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad\": container with ID starting with 95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad not found: ID does not exist" containerID="95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad" Apr 22 18:22:21.449600 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.449505 2561 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad"} err="failed to get container status 
\"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad\": rpc error: code = NotFound desc = could not find container \"95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad\": container with ID starting with 95f6edc0575080ebf0c35ebd011cf1e7cc89af6766da862c2403b694730e43ad not found: ID does not exist" Apr 22 18:22:21.463594 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.463574 2561 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"] Apr 22 18:22:21.469185 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:21.469166 2561 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-65444dcf5-96pv2"] Apr 22 18:22:22.898958 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:22.898913 2561 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" path="/var/lib/kubelet/pods/a9564463-99d6-488c-ac26-bee01a2bbb0d/volumes" Apr 22 18:22:30.468336 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:30.468302 2561 generic.go:358] "Generic (PLEG): container finished" podID="a2acdde8-b24c-4983-b9f7-961b896d0102" containerID="3dc9a3d21cdf832c7da2e5d67588bec9ad1658cf5617b8ab75b114fc162d4bcb" exitCode=0 Apr 22 18:22:30.468682 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:30.468376 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" event={"ID":"a2acdde8-b24c-4983-b9f7-961b896d0102","Type":"ContainerDied","Data":"3dc9a3d21cdf832c7da2e5d67588bec9ad1658cf5617b8ab75b114fc162d4bcb"} Apr 22 18:22:30.468682 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:30.468675 2561 scope.go:117] "RemoveContainer" containerID="3dc9a3d21cdf832c7da2e5d67588bec9ad1658cf5617b8ab75b114fc162d4bcb" Apr 22 18:22:31.475168 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:31.475130 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-zrhmm" event={"ID":"a2acdde8-b24c-4983-b9f7-961b896d0102","Type":"ContainerStarted","Data":"b6014a54e49b6b4f9d425783ae48ab91d8ff2cefaabf7f80948bc2d581100f50"} Apr 22 18:22:31.476456 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:31.476434 2561 generic.go:358] "Generic (PLEG): container finished" podID="f6c11aaf-3f61-4ced-8377-07f284493875" containerID="dcfafe973e3c27acc3cb8ed0bf7bf7871e08d7052c28f9e230a437b61cc94cad" exitCode=0 Apr 22 18:22:31.476559 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:31.476501 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" event={"ID":"f6c11aaf-3f61-4ced-8377-07f284493875","Type":"ContainerDied","Data":"dcfafe973e3c27acc3cb8ed0bf7bf7871e08d7052c28f9e230a437b61cc94cad"} Apr 22 18:22:31.476790 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:31.476777 2561 scope.go:117] "RemoveContainer" containerID="dcfafe973e3c27acc3cb8ed0bf7bf7871e08d7052c28f9e230a437b61cc94cad" Apr 22 18:22:32.481249 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:32.481211 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-sbpmg" event={"ID":"f6c11aaf-3f61-4ced-8377-07f284493875","Type":"ContainerStarted","Data":"26c53ed5d861738a433b02c224c8a340de9d3a57e5b757471ad860b6f8a11e07"} Apr 22 18:22:36.495023 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:36.494988 2561 generic.go:358] "Generic (PLEG): container finished" podID="1876a404-9b16-4840-ba70-f6e585f28d86" containerID="3a81c2fbbf0de14266c0228bb29ff2e0af67c6089cd6358cab1f9a3aa799e3e0" exitCode=0 Apr 22 18:22:36.495406 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:36.495058 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2sbhl" 
event={"ID":"1876a404-9b16-4840-ba70-f6e585f28d86","Type":"ContainerDied","Data":"3a81c2fbbf0de14266c0228bb29ff2e0af67c6089cd6358cab1f9a3aa799e3e0"} Apr 22 18:22:36.495406 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:36.495397 2561 scope.go:117] "RemoveContainer" containerID="3a81c2fbbf0de14266c0228bb29ff2e0af67c6089cd6358cab1f9a3aa799e3e0" Apr 22 18:22:37.500176 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:22:37.500142 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-2sbhl" event={"ID":"1876a404-9b16-4840-ba70-f6e585f28d86","Type":"ContainerStarted","Data":"b6fcd8ac4920abceacca84478560aa9f21283377e005de0c4966439709d54a97"} Apr 22 18:24:27.132334 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:27.132307 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-crcs7_3c1d8261-0db3-4d2b-808a-e6bfde776154/global-pull-secret-syncer/0.log" Apr 22 18:24:27.132879 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:27.132861 2561 ???:1] "http: TLS handshake error from 10.0.143.95:60428: EOF" Apr 22 18:24:27.301423 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:27.301393 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9dcbx_5ae596f0-7dd8-44a5-bece-be47e95773a0/konnectivity-agent/0.log" Apr 22 18:24:27.421808 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:27.421734 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-143-95.ec2.internal_03ab3bb00fd17a2fe851fecedb91531c/haproxy/0.log" Apr 22 18:24:30.847739 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:30.847708 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-g9jhc_ee93315f-9c9b-4049-924b-51b8b2c9e9dc/cluster-monitoring-operator/0.log" Apr 22 18:24:31.080823 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:31.080796 2561 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_node-exporter-gn972_01c89f04-9e92-478e-92f8-ccde40c93b4e/node-exporter/0.log" Apr 22 18:24:31.101148 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:31.101080 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gn972_01c89f04-9e92-478e-92f8-ccde40c93b4e/kube-rbac-proxy/0.log" Apr 22 18:24:31.121944 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:31.121919 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gn972_01c89f04-9e92-478e-92f8-ccde40c93b4e/init-textfile/0.log" Apr 22 18:24:32.776234 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:32.776201 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-vlb5l_ed95f2bf-02e7-48fb-a9ea-047c98cd7939/networking-console-plugin/0.log" Apr 22 18:24:33.173334 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.173305 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/1.log" Apr 22 18:24:33.177253 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.177229 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-fk8pv_d9adb985-4468-4c75-8d62-db92f367d26a/console-operator/2.log" Apr 22 18:24:33.656485 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.656454 2561 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527"] Apr 22 18:24:33.656766 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.656752 2561 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerName="registry" Apr 22 18:24:33.656766 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.656766 2561 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerName="registry" Apr 22 18:24:33.656888 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.656830 2561 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9564463-99d6-488c-ac26-bee01a2bbb0d" containerName="registry" Apr 22 18:24:33.659871 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.659855 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.662700 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.662676 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tgcct\"/\"openshift-service-ca.crt\"" Apr 22 18:24:33.662819 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.662687 2561 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tgcct\"/\"kube-root-ca.crt\"" Apr 22 18:24:33.663955 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.663939 2561 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tgcct\"/\"default-dockercfg-p5vpd\"" Apr 22 18:24:33.667187 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.667164 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527"] Apr 22 18:24:33.780163 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.780126 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-podres\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.780163 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.780166 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-txc9p\" (UniqueName: \"kubernetes.io/projected/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-kube-api-access-txc9p\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.780665 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.780194 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-lib-modules\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.780665 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.780315 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-proc\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.780665 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.780362 2561 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-sys\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881466 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881423 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-proc\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " 
pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881466 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881472 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-sys\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881544 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-podres\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881571 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-proc\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881660 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-podres\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881696 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881666 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txc9p\" (UniqueName: \"kubernetes.io/projected/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-kube-api-access-txc9p\") 
pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881818 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881711 2561 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-lib-modules\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881818 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881673 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-sys\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.881882 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.881828 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-lib-modules\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.889956 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.889933 2561 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txc9p\" (UniqueName: \"kubernetes.io/projected/d32c0f4e-fbe1-4dd5-87b3-e93bef809213-kube-api-access-txc9p\") pod \"perf-node-gather-daemonset-v8527\" (UID: \"d32c0f4e-fbe1-4dd5-87b3-e93bef809213\") " pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:33.945508 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.945431 2561 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-q6bgs_3ade402b-25f9-4705-9e38-c812058fd982/volume-data-source-validator/0.log" Apr 22 18:24:33.970545 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:33.970500 2561 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:34.092547 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.092502 2561 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527"] Apr 22 18:24:34.095335 ip-10-0-143-95 kubenswrapper[2561]: W0422 18:24:34.095307 2561 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd32c0f4e_fbe1_4dd5_87b3_e93bef809213.slice/crio-fb96a30fca4e4c1d1284ea47094d286b3a5637caa3a9841dd8b8ed933f1eaa10 WatchSource:0}: Error finding container fb96a30fca4e4c1d1284ea47094d286b3a5637caa3a9841dd8b8ed933f1eaa10: Status 404 returned error can't find the container with id fb96a30fca4e4c1d1284ea47094d286b3a5637caa3a9841dd8b8ed933f1eaa10 Apr 22 18:24:34.572729 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.571799 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-294lm_44a332db-f3dc-4f80-a249-8ff0d0faa3ae/dns/0.log" Apr 22 18:24:34.591690 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.591659 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-294lm_44a332db-f3dc-4f80-a249-8ff0d0faa3ae/kube-rbac-proxy/0.log" Apr 22 18:24:34.746991 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.746964 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-mnvl5_543e99a2-8eb7-4ffb-b5ef-3e4ee83a0291/dns-node-resolver/0.log" Apr 22 18:24:34.846725 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.846637 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" event={"ID":"d32c0f4e-fbe1-4dd5-87b3-e93bef809213","Type":"ContainerStarted","Data":"9c51e8eb3d035ba7544f0180cf033e4ba0ae0f00a78e67c43673884c436c8a7a"} Apr 22 18:24:34.846725 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.846678 2561 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" event={"ID":"d32c0f4e-fbe1-4dd5-87b3-e93bef809213","Type":"ContainerStarted","Data":"fb96a30fca4e4c1d1284ea47094d286b3a5637caa3a9841dd8b8ed933f1eaa10"} Apr 22 18:24:34.847124 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.846796 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" Apr 22 18:24:34.862552 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:34.862494 2561 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527" podStartSLOduration=1.862479114 podStartE2EDuration="1.862479114s" podCreationTimestamp="2026-04-22 18:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:24:34.862112539 +0000 UTC m=+246.478753668" watchObservedRunningTime="2026-04-22 18:24:34.862479114 +0000 UTC m=+246.479120271" Apr 22 18:24:35.281878 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:35.281851 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-wxxx2_c4067f04-ceb3-492b-a98c-80b9c869cc01/node-ca/0.log" Apr 22 18:24:36.045788 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:36.045758 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65fc44b94d-k2qvj_4280d58f-b305-45fb-a79c-389e20a9cd66/router/0.log" Apr 22 18:24:36.384599 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:36.384572 2561 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-66ht9_319485bf-3dcb-4995-b853-56ed38442a76/serve-healthcheck-canary/0.log"
Apr 22 18:24:36.820939 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:36.820852 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2sbhl_1876a404-9b16-4840-ba70-f6e585f28d86/insights-operator/1.log"
Apr 22 18:24:36.821230 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:36.821208 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-2sbhl_1876a404-9b16-4840-ba70-f6e585f28d86/insights-operator/0.log"
Apr 22 18:24:37.067355 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:37.067328 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlr4j_5cd2fc57-e359-4295-9867-87062abcf7b9/kube-rbac-proxy/0.log"
Apr 22 18:24:37.088491 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:37.088423 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlr4j_5cd2fc57-e359-4295-9867-87062abcf7b9/exporter/0.log"
Apr 22 18:24:37.109339 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:37.109300 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlr4j_5cd2fc57-e359-4295-9867-87062abcf7b9/extractor/0.log"
Apr 22 18:24:40.860225 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:40.860197 2561 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tgcct/perf-node-gather-daemonset-v8527"
Apr 22 18:24:41.238861 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:41.238826 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gszhh_c90628b5-0064-4541-8207-f01d20c9be00/migrator/0.log"
Apr 22 18:24:41.266865 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:41.266840 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-gszhh_c90628b5-0064-4541-8207-f01d20c9be00/graceful-termination/0.log"
Apr 22 18:24:41.599201 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:41.599084 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sbpmg_f6c11aaf-3f61-4ced-8377-07f284493875/kube-storage-version-migrator-operator/1.log"
Apr 22 18:24:41.600029 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:41.599991 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-sbpmg_f6c11aaf-3f61-4ced-8377-07f284493875/kube-storage-version-migrator-operator/0.log"
Apr 22 18:24:42.630018 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.629993 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/kube-multus-additional-cni-plugins/0.log"
Apr 22 18:24:42.654082 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.654061 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/egress-router-binary-copy/0.log"
Apr 22 18:24:42.677152 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.677127 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/cni-plugins/0.log"
Apr 22 18:24:42.713748 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.713728 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/bond-cni-plugin/0.log"
Apr 22 18:24:42.744348 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.744318 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/routeoverride-cni/0.log"
Apr 22 18:24:42.776730 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.776700 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/whereabouts-cni-bincopy/0.log"
Apr 22 18:24:42.802430 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:42.802402 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cbzsc_f617d906-31ed-45b2-ad64-99d0315fed58/whereabouts-cni/0.log"
Apr 22 18:24:43.258381 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:43.258257 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lcrhm_9981a1fc-cec1-4187-82c7-f8a291c71356/kube-multus/0.log"
Apr 22 18:24:43.315871 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:43.315837 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-44prk_c854aae6-d913-46c5-9cec-ae4b5f6e8ff7/network-metrics-daemon/0.log"
Apr 22 18:24:43.336693 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:43.336666 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-44prk_c854aae6-d913-46c5-9cec-ae4b5f6e8ff7/kube-rbac-proxy/0.log"
Apr 22 18:24:44.771853 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.771814 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-controller/0.log"
Apr 22 18:24:44.798283 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.798246 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/0.log"
Apr 22 18:24:44.799206 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.799188 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovn-acl-logging/1.log"
Apr 22 18:24:44.820628 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.820607 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/kube-rbac-proxy-node/0.log"
Apr 22 18:24:44.848418 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.848393 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 18:24:44.874894 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.874873 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/northd/0.log"
Apr 22 18:24:44.896110 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.896089 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/nbdb/0.log"
Apr 22 18:24:44.917627 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:44.917567 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/sbdb/0.log"
Apr 22 18:24:45.002061 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:45.002035 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tp5cv_d74da43e-5f0d-4fd5-94fc-934129e8ccc0/ovnkube-controller/0.log"
Apr 22 18:24:46.033194 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:46.033160 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-g4tf9_ede0fe22-46d7-48a6-93c9-92f84082afd4/check-endpoints/0.log"
Apr 22 18:24:46.154561 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:46.154535 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-sf25h_8ad229f6-99cd-4eac-9f27-b8ae51b8bde3/network-check-target-container/0.log"
Apr 22 18:24:47.211892 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:47.211866 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xs7wj_d5a8446b-4f32-4b50-b5eb-2657be43dc10/iptables-alerter/0.log"
Apr 22 18:24:47.958352 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:47.958269 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-bd65z_c37bfcbd-6bd8-4a75-8c85-a5436d184894/tuned/0.log"
Apr 22 18:24:49.603850 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:49.603822 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-pcct7_10a6275f-3d55-41df-9ed8-7ff7d65b52cf/cluster-samples-operator/0.log"
Apr 22 18:24:49.618861 ip-10-0-143-95 kubenswrapper[2561]: I0422 18:24:49.618841 2561 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-pcct7_10a6275f-3d55-41df-9ed8-7ff7d65b52cf/cluster-samples-operator-watch/0.log"