Apr 20 20:03:21.124150 ip-10-0-141-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 20:03:21.124162 ip-10-0-141-130 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 20:03:21.124170 ip-10-0-141-130 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 20:03:21.124414 ip-10-0-141-130 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 20:03:31.177653 ip-10-0-141-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 20:03:31.177670 ip-10-0-141-130 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 96aa2cf6eead4826afa38af01236061e --
Apr 20 20:06:05.040223 ip-10-0-141-130 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 20:06:05.463612 ip-10-0-141-130 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:06:05.463612 ip-10-0-141-130 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 20:06:05.463612 ip-10-0-141-130 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:06:05.463612 ip-10-0-141-130 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 20:06:05.463612 ip-10-0-141-130 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 20:06:05.465084 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.464986 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470598 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470619 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470623 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470626 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470629 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470632 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:06:05.470624 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470635 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470638 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470641 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470643 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470646 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470649 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470652 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470654 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470657 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470660 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470663 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470666 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470668 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470671 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470674 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470676 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470679 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470681 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470684 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470687 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:06:05.470897 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470689 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470692 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470695 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470697 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470700 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470702 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470705 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470707 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470710 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470712 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470716 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470720 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470723 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470725 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470729 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470732 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470735 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470738 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470741 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470743 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:06:05.471377 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470745 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470748 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470751 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470753 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470755 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470758 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470761 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470763 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470765 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470768 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470772 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470774 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470776 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470779 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470781 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470784 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470786 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470789 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470791 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:06:05.471920 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470794 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470797 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470799 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470802 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470808 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470812 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470815 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470821 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470825 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470828 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470831 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470833 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470835 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470839 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470841 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470844 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470847 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470850 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470852 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470855 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:06:05.472383 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.470857 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471285 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471291 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471294 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471297 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471300 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471302 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471305 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471308 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471310 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471313 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471316 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471320 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471323 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471326 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471328 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471331 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471334 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471337 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471340 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:06:05.472925 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471342 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471345 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471348 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471351 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471353 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471357 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471359 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471362 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471366 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471371 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471374 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471377 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471380 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471383 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471386 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471389 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471392 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471395 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471398 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471400 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:06:05.473405 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471403 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471406 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471408 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471411 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471413 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471416 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471419 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471422 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471424 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471427 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471429 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471432 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471435 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471438 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471441 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471443 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471446 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471448 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471453 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:06:05.473922 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471457 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471459 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471462 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471464 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471467 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471469 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471472 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471474 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471477 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471479 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471482 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471484 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471487 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471489 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471492 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471495 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471497 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471500 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471502 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:06:05.474467 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471505 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471507 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471510 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471512 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471515 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471518 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471520 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471523 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.471525 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472724 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472736 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472744 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472748 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472754 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472757 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472762 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472767 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472770 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472773 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472776 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472780 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 20:06:05.474953 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472783 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472786 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472789 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472791 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472795 2579 flags.go:64] FLAG: --cloud-config=""
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472797 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472800 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472804 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472808 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472812 2579 flags.go:64] FLAG: --config-dir=""
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472814 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472818 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472822 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472825 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472828 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472831 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472834 2579 flags.go:64] FLAG: --contention-profiling="false" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472837 2579 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472840 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472843 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472846 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472850 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472853 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472856 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472859 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 20:06:05.475473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472862 2579 flags.go:64] FLAG: --enable-server="true" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472865 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 20:06:05.476089 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472887 2579 flags.go:64] FLAG: --event-burst="100" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472893 2579 flags.go:64] FLAG: --event-qps="50" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472897 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472900 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472902 2579 flags.go:64] FLAG: --eviction-hard="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472906 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472909 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472912 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472915 2579 flags.go:64] FLAG: --eviction-soft="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472918 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472921 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472925 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472928 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472931 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472935 2579 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472938 2579 flags.go:64] FLAG: --feature-gates="" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472942 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472945 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472948 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472952 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472955 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472958 2579 flags.go:64] FLAG: --help="false" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472961 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-141-130.ec2.internal" Apr 20 20:06:05.476089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472964 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472967 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472970 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472973 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472976 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: 
I0420 20:06:05.472979 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472982 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472985 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472988 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472991 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472994 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.472997 2579 flags.go:64] FLAG: --kube-reserved="" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473000 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473003 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473006 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473008 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473011 2579 flags.go:64] FLAG: --lock-file="" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473014 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473017 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473020 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 
20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473025 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473028 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473031 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 20:06:05.476693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473036 2579 flags.go:64] FLAG: --logging-format="text" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473039 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473042 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473046 2579 flags.go:64] FLAG: --manifest-url="" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473048 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473053 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473056 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473060 2579 flags.go:64] FLAG: --max-pods="110" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473063 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473066 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473069 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:06:05.473071 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473074 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473077 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473080 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473088 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473091 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473093 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473097 2579 flags.go:64] FLAG: --pod-cidr="" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473100 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473106 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473109 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473112 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473115 2579 flags.go:64] FLAG: --port="10250" Apr 20 20:06:05.477286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473118 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 20:06:05.477869 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:06:05.473121 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ad388bc127ea7f07" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473124 2579 flags.go:64] FLAG: --qos-reserved="" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473127 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473130 2579 flags.go:64] FLAG: --register-node="true" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473133 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473135 2579 flags.go:64] FLAG: --register-with-taints="" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473139 2579 flags.go:64] FLAG: --registry-burst="10" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473143 2579 flags.go:64] FLAG: --registry-qps="5" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473146 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473149 2579 flags.go:64] FLAG: --reserved-memory="" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473153 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473156 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473159 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473162 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473165 2579 flags.go:64] FLAG: --runonce="false" Apr 20 20:06:05.477869 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:06:05.473167 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473170 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473173 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473176 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473178 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473181 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473184 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473187 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473190 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473193 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 20:06:05.477869 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473196 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473200 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473202 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473205 2579 flags.go:64] FLAG: --system-cgroups="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473208 2579 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473214 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473217 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473220 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473224 2579 flags.go:64] FLAG: --tls-min-version="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473227 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473230 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473232 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473235 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473238 2579 flags.go:64] FLAG: --v="2" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473244 2579 flags.go:64] FLAG: --version="false" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473248 2579 flags.go:64] FLAG: --vmodule="" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473253 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473256 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473355 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 
20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473359 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473362 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473364 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473367 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473369 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 20:06:05.478504 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473372 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473374 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473377 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473379 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473382 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473384 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473386 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473389 2579 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473392 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473394 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473398 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473400 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473403 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473405 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473408 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473411 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473413 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473416 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473418 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 20:06:05.479120 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473421 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 
20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473424 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473426 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473430 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473433 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473435 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473437 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473440 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473444 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473448 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473451 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473453 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473457 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473461 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473464 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473467 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473469 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473472 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473475 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 20:06:05.479640 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473477 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473480 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473483 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473486 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473489 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473492 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473495 
2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473497 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473500 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473502 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473505 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473507 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473509 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473512 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473514 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473517 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473524 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473526 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473529 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 20:06:05.480153 
ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473531 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 20:06:05.480153 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473534 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473536 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473539 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473542 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473544 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473546 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473549 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473551 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473554 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473557 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473559 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473561 2579 feature_gate.go:328] unrecognized feature gate: 
MachineAPIMigration
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473564 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473567 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473569 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473572 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473574 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473577 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473580 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473583 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:06:05.480682 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473585 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:06:05.481182 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.473587 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:06:05.481182 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.473596 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:06:05.481887 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.481850 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 20:06:05.481919 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.481891 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 20:06:05.481963 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481953 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:06:05.481963 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481963 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481966 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481970 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481973 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481975 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481978 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481982 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481985 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481988 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481991 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481993 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481996 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.481999 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482001 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482004 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482007 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482009 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482012 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482014 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:06:05.482023 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482017 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482020 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482023 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482025 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482028 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482031 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482034 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482036 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482039 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482042 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482045 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482048 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482051 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482053 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482056 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482059 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482061 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482063 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482066 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482068 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:06:05.482481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482071 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482073 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482076 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482078 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482081 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482083 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482086 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482088 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482091 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482093 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482097 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482099 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482102 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482107 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482112 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482115 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482118 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482121 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482123 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482126 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:06:05.483032 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482129 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482131 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482134 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482136 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482140 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482143 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482145 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482149 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482151 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482154 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482158 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482162 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482164 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482168 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482171 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482173 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482176 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482178 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482181 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:06:05.483528 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482183 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482186 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482189 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482192 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482195 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482197 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482201 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.482206 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482341 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482347 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482350 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482354 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482357 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482359 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482362 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 20:06:05.484012 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482365 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482368 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482371 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482374 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482377 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482380 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482382 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482385 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482387 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482390 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482392 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482395 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482397 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482400 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482402 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482405 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482408 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482410 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482412 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482415 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 20:06:05.484422 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482417 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482420 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482422 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482426 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482428 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482431 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482433 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482436 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482438 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482441 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482443 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482446 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482448 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482450 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482453 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482455 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482459 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482461 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482463 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482466 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 20:06:05.484937 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482469 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482471 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482473 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482476 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482478 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482481 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482483 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482486 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482488 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482491 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482495 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482499 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482502 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482505 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482507 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482511 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482514 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482518 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482521 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 20:06:05.485419 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482524 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482526 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482529 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482532 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482534 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482537 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482539 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482542 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482544 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482546 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482549 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482552 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482554 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482557 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482559 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482562 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482564 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482567 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482569 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 20:06:05.485968 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:05.482572 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 20:06:05.486442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.482576 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 20:06:05.486442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.483353 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 20:06:05.486442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.485567 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 20:06:05.486653 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.486642 2579 server.go:1019] "Starting client certificate rotation"
Apr 20 20:06:05.486770 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.486749 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:06:05.486822 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.486802 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 20:06:05.509913 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.509864 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:06:05.512779 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.512743 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 20:06:05.526664 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.526640 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 20 20:06:05.532889 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.532854 2579 log.go:25] "Validated CRI v1 image API"
Apr 20 20:06:05.535116 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.535096 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 20:06:05.540863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.540835 2579 fs.go:135] Filesystem UUIDs: map[2567fe34-06ee-49b4-a1f1-c56bd2d5a37a:/dev/nvme0n1p3 6fabc6a0-f425-418a-95de-c9b288f4951f:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 20 20:06:05.540966 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.540861 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 20:06:05.545083 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.545066 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:06:05.546566 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.546449 2579 manager.go:217] Machine: {Timestamp:2026-04-20 20:06:05.544677798 +0000 UTC m=+0.392220830 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3088758 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28e9b55425780ab940ff182780098a SystemUUID:ec28e9b5-5425-780a-b940-ff182780098a BootID:96aa2cf6-eead-4826-afa3-8af01236061e Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8c:5f:7b:1c:b7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8c:5f:7b:1c:b7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fa:38:6e:f9:1e:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 20:06:05.546566 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.546561 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 20:06:05.546701 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.546688 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 20:06:05.547776 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.547744 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 20:06:05.547940 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.547778 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-130.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 20:06:05.547993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.547952 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 20:06:05.547993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.547961 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 20:06:05.547993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.547975 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:06:05.548812 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.548799 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 20:06:05.550497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.550485 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:06:05.550608 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.550598 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 20:06:05.552706 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.552695 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 20:06:05.552761 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.552711 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 20:06:05.552761 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.552725 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 20:06:05.552761 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.552735 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 20 20:06:05.552761 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.552745 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 20:06:05.554093 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.554079 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 20:06:05.554151 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.554100 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 20:06:05.557263 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.557239 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 20:06:05.558834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.558817 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 20:06:05.560682 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560668 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560688 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560694 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560700 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560707 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560714 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560723 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560729 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560738 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 20:06:05.560744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560744 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 20:06:05.561017 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560753 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 20:06:05.561017 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.560762 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 20:06:05.561547 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.561536 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 20:06:05.561624 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.561549 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 20:06:05.565036 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.564976 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-141-130.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 20:06:05.565134 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.565028 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 20:06:05.565134 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.565028 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 20:06:05.565752 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.565741 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 20:06:05.565792 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.565780 2579 server.go:1295] "Started kubelet"
Apr 20 20:06:05.565884 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.565844 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 20:06:05.566005 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.565912 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 20:06:05.566066 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.566034 2579 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 20:06:05.566737 ip-10-0-141-130 systemd[1]: Started Kubernetes Kubelet.
Apr 20 20:06:05.567551 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.567525 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 20:06:05.569124 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.569110 2579 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 20:06:05.574461 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.574433 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lcxp9"
Apr 20 20:06:05.575241 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.574377 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-130.ec2.internal.18a82964b87439dd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-130.ec2.internal,UID:ip-10-0-141-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-141-130.ec2.internal,},FirstTimestamp:2026-04-20 20:06:05.565753821 +0000 UTC m=+0.413296854,LastTimestamp:2026-04-20 20:06:05.565753821 +0000 UTC m=+0.413296854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-130.ec2.internal,}"
Apr 20 20:06:05.575347 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.575257 2579 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 20:06:05.575682 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.575669 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 20:06:05.576310 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.576295 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 20:06:05.577055 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577033 2579 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 20:06:05.577131 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577060 2579 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 20:06:05.577131 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577036 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 20:06:05.577131 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577128 2579 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 20:06:05.577298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577138 2579 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 20:06:05.577298 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.577205 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:05.577298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577254 2579 factory.go:153] Registering CRI-O factory
Apr 20 20:06:05.577298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577269 2579 factory.go:223] Registration of the crio container factory successfully
Apr 20 20:06:05.577484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577314 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 20:06:05.577484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577321 2579 factory.go:55] Registering systemd factory
Apr 20 20:06:05.577484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577327 2579 factory.go:223] Registration of the systemd container factory successfully
Apr 20 20:06:05.577484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577346 2579 factory.go:103] Registering Raw factory
Apr 20 20:06:05.577484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577355 2579 manager.go:1196] Started watching for new ooms in manager
Apr 20 20:06:05.577744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.577726 2579 manager.go:319] Starting recovery of all containers
Apr 20 20:06:05.579691 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.579661 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-141-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 20:06:05.579800 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.579779 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 20:06:05.584547 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.584359 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-lcxp9"
Apr 20 20:06:05.588798 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.588780 2579 manager.go:324] Recovery completed
Apr 20 20:06:05.593289 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.593276 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.596324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596306 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.596422 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596338 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.596422 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596349 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.596909 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596897 2579 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 20:06:05.596963 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596911 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 20:06:05.596963 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.596929 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 20:06:05.598909 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.598822 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-141-130.ec2.internal.18a82964ba46af2b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-141-130.ec2.internal,UID:ip-10-0-141-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-141-130.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-141-130.ec2.internal,},FirstTimestamp:2026-04-20 20:06:05.596323627 +0000 UTC m=+0.443866659,LastTimestamp:2026-04-20 20:06:05.596323627 +0000 UTC m=+0.443866659,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-141-130.ec2.internal,}"
Apr 20 20:06:05.600089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.600073 2579 policy_none.go:49] "None policy: Start"
Apr 20 20:06:05.600089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.600091 2579 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 20:06:05.600225 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.600100 2579 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 20:06:05.645582 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.645560 2579 manager.go:341] "Starting Device Plugin manager"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.645604 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.645618 2579 server.go:85] "Starting device plugin registration server"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.645923 2579 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.645938 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.646029 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.646114 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.646123 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.646758 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 20:06:05.672847 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.646798 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:05.729596 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.729503 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 20:06:05.730949 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.730930 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 20:06:05.731020 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.730960 2579 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 20:06:05.731020 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.730983 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 20:06:05.731020 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.730994 2579 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 20:06:05.731135 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.731038 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 20:06:05.733528 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.733502 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:06:05.746620 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.746581 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.752908 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.752888 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.752991 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.752928 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.752991 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.752939 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.752991 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.752965 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.760823 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.760802 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.760904 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.760832 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-130.ec2.internal\": node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:05.779677 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.779644 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:05.831419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.831367 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"]
Apr 20 20:06:05.831566 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.831467 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.832502 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.832485 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.832611 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.832516 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.832611 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.832531 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.834868 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.834854 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.835015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835001 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.835058 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835031 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.835704 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835688 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.835757 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835719 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.835757 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835729 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.835830 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835696 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.835830 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835789 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.835830 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.835802 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.837985 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.837967 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.838098 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.837994 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 20:06:05.838716 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.838701 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 20:06:05.838791 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.838727 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 20:06:05.838791 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.838736 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeHasSufficientPID"
Apr 20 20:06:05.871695 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.871660 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-130.ec2.internal\" not found" node="ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.876122 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.876102 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-130.ec2.internal\" not found" node="ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.879231 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.879212 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.879295 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.879240 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.879295 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.879262 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5ddc06d76dc03ab1fc5182830ef0d45-config\") pod \"kube-apiserver-proxy-ip-10-0-141-130.ec2.internal\" (UID: \"d5ddc06d76dc03ab1fc5182830ef0d45\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.880381 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.880363 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:05.980136 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980136 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980136 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5ddc06d76dc03ab1fc5182830ef0d45-config\") pod \"kube-apiserver-proxy-ip-10-0-141-130.ec2.internal\" (UID: \"d5ddc06d76dc03ab1fc5182830ef0d45\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980136 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980350 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e7850bbe80921bbe4ecd705caf3e4d36-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal\" (UID: \"e7850bbe80921bbe4ecd705caf3e4d36\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980350 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:05.980184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/d5ddc06d76dc03ab1fc5182830ef0d45-config\") pod \"kube-apiserver-proxy-ip-10-0-141-130.ec2.internal\" (UID: \"d5ddc06d76dc03ab1fc5182830ef0d45\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:05.980569 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:05.980556 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.080823 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.080775 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.174075 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.174046 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:06.180181 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.180157 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"
Apr 20 20:06:06.181238 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.181220 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.281832 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.281736 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.382256 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.382222 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.482558 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.482523 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.486683 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.486661 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 20:06:06.486849 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.486831 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 20:06:06.576404 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.576302 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 20:06:06.582998 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.582967 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.587420 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.587388 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 20:01:05 +0000 UTC" deadline="2027-10-27 21:53:37.446834811 +0000 UTC"
Apr 20 20:06:06.587420 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.587418 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13321h47m30.85942099s"
Apr 20 20:06:06.587586 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.587452 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 20:06:06.608593 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.608559 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-n6msc"
Apr 20 20:06:06.616847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.616822 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-n6msc"
Apr 20 20:06:06.683417 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.683383 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.759529 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.759496 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:06:06.783688 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.783654 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.863586 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:06.863551 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ddc06d76dc03ab1fc5182830ef0d45.slice/crio-e60caf6eaf7c900812aae2d4cdfe65b1f416ad7aed9fa47e3e7b8f7f49124b48 WatchSource:0}: Error finding container e60caf6eaf7c900812aae2d4cdfe65b1f416ad7aed9fa47e3e7b8f7f49124b48: Status 404 returned error can't find the container with id e60caf6eaf7c900812aae2d4cdfe65b1f416ad7aed9fa47e3e7b8f7f49124b48
Apr 20 20:06:06.864029 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:06.864004 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7850bbe80921bbe4ecd705caf3e4d36.slice/crio-3e52c5ae538bc803724c9e662ddcfb68461aa8dcf4f3e82ea64448aea9bb0205 WatchSource:0}: Error finding container 3e52c5ae538bc803724c9e662ddcfb68461aa8dcf4f3e82ea64448aea9bb0205: Status 404 returned error can't find the container with id 3e52c5ae538bc803724c9e662ddcfb68461aa8dcf4f3e82ea64448aea9bb0205
Apr 20 20:06:06.868976 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.868960 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 20:06:06.884777 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.884748 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found"
Apr 20 20:06:06.985355
ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:06.985317 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-130.ec2.internal\" not found" Apr 20 20:06:06.989630 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:06.989609 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:06:07.009804 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.009772 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 20:06:07.076939 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.076907 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" Apr 20 20:06:07.088984 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.088954 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:06:07.089815 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.089800 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal" Apr 20 20:06:07.103857 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.103790 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 20:06:07.554110 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.554021 2579 apiserver.go:52] "Watching apiserver" Apr 20 20:06:07.562395 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.562365 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 20:06:07.562825 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.562788 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/node-ca-7t5kf","openshift-multus/multus-7mjbn","openshift-multus/network-metrics-daemon-g7wqd","openshift-ovn-kubernetes/ovnkube-node-48d2t","kube-system/konnectivity-agent-t7s46","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg","openshift-cluster-node-tuning-operator/tuned-qktfm","openshift-dns/node-resolver-n52n7","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal","openshift-multus/multus-additional-cni-plugins-vrb5p","openshift-network-diagnostics/network-check-target-ncwtr","openshift-network-operator/iptables-alerter-nqld6","kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal"] Apr 20 20:06:07.565255 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.565231 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.568129 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.567610 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.568129 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.567811 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.568129 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.567997 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-wq42t\"" Apr 20 20:06:07.572476 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.572454 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:07.572607 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.572542 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:07.575302 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.575276 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.577813 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.577579 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 20:06:07.577813 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.577666 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-bj44p\"" Apr 20 20:06:07.577813 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.577579 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.577813 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.577760 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-t7s46" Apr 20 20:06:07.578203 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.577860 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.578203 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.578076 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 20:06:07.580023 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.579982 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 20:06:07.580151 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.580112 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 20:06:07.580217 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.580202 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-fqqdn\"" Apr 20 20:06:07.580342 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.580325 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.582671 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.582648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 20:06:07.582805 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.582744 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.582977 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.583023 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.583090 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.583202 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.583287 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 20:06:07.583497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.583475 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9vxnk\"" Apr 20 20:06:07.585828 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.585804 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 20:06:07.586665 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.586520 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 20:06:07.586665 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.586538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vxb9h\"" Apr 20 20:06:07.586665 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.586595 2579 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 20:06:07.586665 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.586633 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7t5kf" Apr 20 20:06:07.586949 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.586853 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n52n7" Apr 20 20:06:07.588718 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncvp7\" (UniqueName: \"kubernetes.io/projected/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-kube-api-access-ncvp7\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.588845 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-ovn\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.588845 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-bin\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.588845 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:07.588845 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588812 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-systemd-units\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588840 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cnibin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-netns\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588964 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-hostroot\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.588996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-systemd\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589071 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-system-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589077 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-node-log\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589098 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-run\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:06:07.589119 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-tmp\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-slash\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-sys\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-conf-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-etc-kubernetes\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589315 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:06:07.589284 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-script-lib\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tq4v\" (UniqueName: \"kubernetes.io/projected/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-kube-api-access-6tq4v\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589348 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-multus-certs\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589386 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-log-socket\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-kubernetes\") pod \"tuned-qktfm\" (UID: 
\"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg4bg\" (UniqueName: \"kubernetes.io/projected/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-kube-api-access-hg4bg\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589489 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-systemd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-host\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-modprobe-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-kubelet\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589584 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-multus\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589614 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-config\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589651 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-os-release\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589687 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovn-node-metrics-cert\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.589729 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589710 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb961195-80bd-4743-b8f8-8b5ed4db814c-konnectivity-ca\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46" Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589739 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-var-lib-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-netd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysconfig\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-conf\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-k8s-cni-cncf-io\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589855 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-bin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb961195-80bd-4743-b8f8-8b5ed4db814c-agent-certs\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589923 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-lib-modules\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.589941 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-tuned\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590012 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-netns\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590039 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590065 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cni-binary-copy\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-socket-dir-parent\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590128 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htqfp\" (UniqueName: \"kubernetes.io/projected/5020f6ce-7061-484f-9b7d-89141a36e42c-kube-api-access-htqfp\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.590425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-env-overrides\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-var-lib-kubelet\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590399 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-kubelet\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-daemon-config\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590495 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-etc-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590526 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590529 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590636 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590769 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590855 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.590998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-tpnm4\""
Apr 20 20:06:07.591161 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.591065 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8zqck\""
Apr 20 20:06:07.591866 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.591747 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.591866 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.591761 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:07.591866 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.591838 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:07.592752 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.592731 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 20:06:07.594194 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.594174 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.595180 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.595162 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 20:06:07.595307 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.595289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-gbmxj\""
Apr 20 20:06:07.595435 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.595421 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 20:06:07.597238 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.597215 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:06:07.597472 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.597452 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-ffztq\""
Apr 20 20:06:07.597584 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.597562 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 20 20:06:07.597699 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.597591 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 20 20:06:07.619419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.619389 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:01:06 +0000 UTC" deadline="2027-11-13 07:41:31.538662501 +0000 UTC"
Apr 20 20:06:07.619419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.619414 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13715h35m23.91925223s"
Apr 20 20:06:07.678827 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.678767 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 20:06:07.691460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tq4v\" (UniqueName: \"kubernetes.io/projected/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-kube-api-access-6tq4v\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.691460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691455 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a545fd66-1350-4958-93ed-a017391ba8d7-serviceca\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-multus-certs\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-log-socket\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-kubernetes\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-multus-certs\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-kubernetes\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-log-socket\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691642 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-hosts-file\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.691686 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691685 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzql9\" (UniqueName: \"kubernetes.io/projected/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-kube-api-access-fzql9\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691712 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691738 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hg4bg\" (UniqueName: \"kubernetes.io/projected/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-kube-api-access-hg4bg\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691792 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691819 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-systemd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.691893 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-systemd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-host\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692143 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-system-cni-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.692179 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-modprobe-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-host\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-os-release\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692230 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-kubelet\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692321 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-multus\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692349 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-config\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692362 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-modprobe-d\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692378 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cnibin\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692406 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-kubelet\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-multus\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692480 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692508 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-os-release\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovn-node-metrics-cert\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb961195-80bd-4743-b8f8-8b5ed4db814c-konnectivity-ca\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692593 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-var-lib-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692595 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-os-release\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.692679 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692618 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-netd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysconfig\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-conf\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg5vv\" (UniqueName: \"kubernetes.io/projected/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-kube-api-access-rg5vv\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692847 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysctl-conf\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692888 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692909 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-netd\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692949 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-sysconfig\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.692989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-var-lib-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693291 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-tmp-dir\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-iptables-alerter-script\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-k8s-cni-cncf-io\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-bin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb961195-80bd-4743-b8f8-8b5ed4db814c-agent-certs\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:07.693458 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-lib-modules\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-tuned\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693508 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-k8s-cni-cncf-io\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693462 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-config\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-host-slash\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/bb961195-80bd-4743-b8f8-8b5ed4db814c-konnectivity-ca\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693599 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-cni-bin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693617 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-sys-fs\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-netns\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-lib-modules\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693910 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-netns\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.693987 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cni-binary-copy\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694101 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-socket-dir-parent\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htqfp\" (UniqueName: \"kubernetes.io/projected/5020f6ce-7061-484f-9b7d-89141a36e42c-kube-api-access-htqfp\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694284 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-env-overrides\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-var-lib-kubelet\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420
20:06:07.694362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vkm5\" (UniqueName: \"kubernetes.io/projected/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-kube-api-access-2vkm5\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6" Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694392 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-kubelet\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-daemon-config\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.694672 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-etc-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694487 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 
20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694837 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-socket-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-var-lib-kubelet\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-etc-openvswitch\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-var-lib-kubelet\") pod \"tuned-qktfm\" (UID: 
\"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-socket-dir-parent\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694784 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-run-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ncvp7\" (UniqueName: \"kubernetes.io/projected/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-kube-api-access-ncvp7\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cni-binary-copy\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.694713 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-env-overrides\") pod \"ovnkube-node-48d2t\" (UID: 
\"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-ovn\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-bin\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695086 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-device-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695101 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-daemon-config\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695117 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-run-ovn\") pod \"ovnkube-node-48d2t\" (UID: 
\"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.695600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rl5c\" (UniqueName: \"kubernetes.io/projected/bcdae5fc-2620-4126-a962-272f1c67124b-kube-api-access-2rl5c\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695169 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-cni-bin\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695244 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-systemd-units\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cnibin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695299 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-netns\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-hostroot\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-systemd-units\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.695327 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-systemd\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-system-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.695454 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:08.195433458 +0000 UTC m=+3.042976480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-system-cni-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-node-log\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-run\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-cnibin\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.696479 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695601 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-host-run-netns\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-tmp\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695637 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-systemd\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-hostroot\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695655 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695695 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a545fd66-1350-4958-93ed-a017391ba8d7-host\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695746 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-run\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-node-log\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695817 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jlld\" (UniqueName: \"kubernetes.io/projected/a545fd66-1350-4958-93ed-a017391ba8d7-kube-api-access-9jlld\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695849 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-slash\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695900 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-sys\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-registration-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695928 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-host-slash\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.695977 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-conf-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696015 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-etc-kubernetes\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696018 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-sys\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-script-lib\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.697324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-multus-conf-dir\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.698367 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696210 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-etc-kubernetes\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.698367 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696698 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovnkube-script-lib\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.698367 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696728 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-etc-tuned\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.698367 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-ovn-node-metrics-cert\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.698367 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.696826 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/bb961195-80bd-4743-b8f8-8b5ed4db814c-agent-certs\") pod \"konnectivity-agent-t7s46\" (UID: \"bb961195-80bd-4743-b8f8-8b5ed4db814c\") " pod="kube-system/konnectivity-agent-t7s46" Apr 20 20:06:07.698743 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.698721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-tmp\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.700473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.700449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg4bg\" (UniqueName: 
\"kubernetes.io/projected/1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac-kube-api-access-hg4bg\") pod \"ovnkube-node-48d2t\" (UID: \"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac\") " pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:06:07.702259 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.702227 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tq4v\" (UniqueName: \"kubernetes.io/projected/f38465e3-0b1f-43c3-9a2b-3f294c21b82a-kube-api-access-6tq4v\") pod \"tuned-qktfm\" (UID: \"f38465e3-0b1f-43c3-9a2b-3f294c21b82a\") " pod="openshift-cluster-node-tuning-operator/tuned-qktfm" Apr 20 20:06:07.703334 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.703298 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncvp7\" (UniqueName: \"kubernetes.io/projected/7307693c-15c5-4f2a-ac49-7a1626eb5a0d-kube-api-access-ncvp7\") pod \"multus-7mjbn\" (UID: \"7307693c-15c5-4f2a-ac49-7a1626eb5a0d\") " pod="openshift-multus/multus-7mjbn" Apr 20 20:06:07.704410 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.704380 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htqfp\" (UniqueName: \"kubernetes.io/projected/5020f6ce-7061-484f-9b7d-89141a36e42c-kube-api-access-htqfp\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:07.736355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.736293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal" event={"ID":"d5ddc06d76dc03ab1fc5182830ef0d45","Type":"ContainerStarted","Data":"e60caf6eaf7c900812aae2d4cdfe65b1f416ad7aed9fa47e3e7b8f7f49124b48"} Apr 20 20:06:07.737392 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.737361 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" event={"ID":"e7850bbe80921bbe4ecd705caf3e4d36","Type":"ContainerStarted","Data":"3e52c5ae538bc803724c9e662ddcfb68461aa8dcf4f3e82ea64448aea9bb0205"}
Apr 20 20:06:07.797131 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vkm5\" (UniqueName: \"kubernetes.io/projected/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-kube-api-access-2vkm5\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.797312 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.797371 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797341 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-socket-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797427 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-device-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797471 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rl5c\" (UniqueName: \"kubernetes.io/projected/bcdae5fc-2620-4126-a962-272f1c67124b-kube-api-access-2rl5c\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797471 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-socket-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797471 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797343 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-tuning-conf-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.797595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-device-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797545 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.797685 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797606 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a545fd66-1350-4958-93ed-a017391ba8d7-host\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.797685 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797634 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jlld\" (UniqueName: \"kubernetes.io/projected/a545fd66-1350-4958-93ed-a017391ba8d7-kube-api-access-9jlld\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.797685 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-registration-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.797819 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797685 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a545fd66-1350-4958-93ed-a017391ba8d7-serviceca\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.797819 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797704 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-hosts-file\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.797819 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzql9\" (UniqueName: \"kubernetes.io/projected/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-kube-api-access-fzql9\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.797819 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797755 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:07.797819 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797849 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-system-cni-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-os-release\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797945 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cnibin\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.797972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg5vv\" (UniqueName: \"kubernetes.io/projected/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-kube-api-access-rg5vv\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-tmp-dir\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-iptables-alerter-script\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.798085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798076 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-etc-selinux\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-host-slash\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798137 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-sys-fs\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798204 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-sys-fs\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-hosts-file\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-system-cni-dir\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/a545fd66-1350-4958-93ed-a017391ba8d7-serviceca\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798308 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-os-release\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798332 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-registration-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a545fd66-1350-4958-93ed-a017391ba8d7-host\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bcdae5fc-2620-4126-a962-272f1c67124b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.798503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cnibin\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.799119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798617 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-tmp-dir\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.799119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798675 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-host-slash\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.799119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.798865 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-binary-copy\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.799119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.799024 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.799303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.799192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-iptables-alerter-script\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.804401 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.804016 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:07.804401 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.804043 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:07.804401 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.804058 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:07.804401 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:07.804159 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:08.304118618 +0000 UTC m=+3.151661640 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:07.807088 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.807066 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzql9\" (UniqueName: \"kubernetes.io/projected/f4f29d83-69da-4bc7-a3ce-9bcc03a224ed-kube-api-access-fzql9\") pod \"node-resolver-n52n7\" (UID: \"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed\") " pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.807212 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.807162 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg5vv\" (UniqueName: \"kubernetes.io/projected/9774d5fe-006d-4f52-a1aa-6ab55dcf9946-kube-api-access-rg5vv\") pod \"multus-additional-cni-plugins-vrb5p\" (UID: \"9774d5fe-006d-4f52-a1aa-6ab55dcf9946\") " pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.807212 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.807175 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rl5c\" (UniqueName: \"kubernetes.io/projected/bcdae5fc-2620-4126-a962-272f1c67124b-kube-api-access-2rl5c\") pod \"aws-ebs-csi-driver-node-cpgcg\" (UID: \"bcdae5fc-2620-4126-a962-272f1c67124b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.807777 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.807747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vkm5\" (UniqueName: \"kubernetes.io/projected/0cc0ab62-60fe-454e-bf85-90ca6410b3d1-kube-api-access-2vkm5\") pod \"iptables-alerter-nqld6\" (UID: \"0cc0ab62-60fe-454e-bf85-90ca6410b3d1\") " pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.807892 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.807839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jlld\" (UniqueName: \"kubernetes.io/projected/a545fd66-1350-4958-93ed-a017391ba8d7-kube-api-access-9jlld\") pod \"node-ca-7t5kf\" (UID: \"a545fd66-1350-4958-93ed-a017391ba8d7\") " pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.880862 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.880813 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qktfm"
Apr 20 20:06:07.894768 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.894736 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7mjbn"
Apr 20 20:06:07.905778 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.905746 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:07.911550 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.911505 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:07.920393 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.920352 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg"
Apr 20 20:06:07.932281 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.932245 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7t5kf"
Apr 20 20:06:07.940060 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.940032 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-n52n7"
Apr 20 20:06:07.947760 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.947734 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-vrb5p"
Apr 20 20:06:07.956503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.956473 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-nqld6"
Apr 20 20:06:07.968974 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:07.968946 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 20:06:08.200943 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.200833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:08.201115 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.201014 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:08.201173 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.201117 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:09.201095676 +0000 UTC m=+4.048638707 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:08.402065 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.402024 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:08.402217 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.402200 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:08.402277 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.402223 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:08.402277 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.402234 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:08.402375 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.402295 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:09.402276483 +0000 UTC m=+4.249819511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:08.582779 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.582737 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb961195_80bd_4743_b8f8_8b5ed4db814c.slice/crio-88218b6832e202fb0a1aa32715bb2943e4a731d4ed4819e3e2a5b9f707304c07 WatchSource:0}: Error finding container 88218b6832e202fb0a1aa32715bb2943e4a731d4ed4819e3e2a5b9f707304c07: Status 404 returned error can't find the container with id 88218b6832e202fb0a1aa32715bb2943e4a731d4ed4819e3e2a5b9f707304c07
Apr 20 20:06:08.585617 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.585599 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e81f4c9_5ef3_4e1c_b95a_2cb9fe8e16ac.slice/crio-632da09477d53612221a9c2c72e52fd76db8f71e2eba1c8f2f6e724bbd68cf04 WatchSource:0}: Error finding container 632da09477d53612221a9c2c72e52fd76db8f71e2eba1c8f2f6e724bbd68cf04: Status 404 returned error can't find the container with id 632da09477d53612221a9c2c72e52fd76db8f71e2eba1c8f2f6e724bbd68cf04
Apr 20 20:06:08.587043 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.586976 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cc0ab62_60fe_454e_bf85_90ca6410b3d1.slice/crio-f2aaa17bd40a897f693c4efe62534802a54aa00c0b7b2eb2d5ef8043d5a67e37 WatchSource:0}: Error finding container f2aaa17bd40a897f693c4efe62534802a54aa00c0b7b2eb2d5ef8043d5a67e37: Status 404 returned error can't find the container with id f2aaa17bd40a897f693c4efe62534802a54aa00c0b7b2eb2d5ef8043d5a67e37
Apr 20 20:06:08.589396 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.589367 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7307693c_15c5_4f2a_ac49_7a1626eb5a0d.slice/crio-ad42f000fd9192d388ac5691fda9155ba91591ac862c8b52514917b702af1d16 WatchSource:0}: Error finding container ad42f000fd9192d388ac5691fda9155ba91591ac862c8b52514917b702af1d16: Status 404 returned error can't find the container with id ad42f000fd9192d388ac5691fda9155ba91591ac862c8b52514917b702af1d16
Apr 20 20:06:08.590388 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.590356 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcdae5fc_2620_4126_a962_272f1c67124b.slice/crio-9c3c809acffdef031d0b7e7d76f2d418b0d83f119338370b851426f954d4f310 WatchSource:0}: Error finding container 9c3c809acffdef031d0b7e7d76f2d418b0d83f119338370b851426f954d4f310: Status 404 returned error can't find the container with id 9c3c809acffdef031d0b7e7d76f2d418b0d83f119338370b851426f954d4f310
Apr 20 20:06:08.590822 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.590798 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf38465e3_0b1f_43c3_9a2b_3f294c21b82a.slice/crio-11a1531a3606a075151ec8528b356cf61451bde86d53f8bcb97fe6c18db87f8d WatchSource:0}: Error finding container 11a1531a3606a075151ec8528b356cf61451bde86d53f8bcb97fe6c18db87f8d: Status 404 returned error can't find the container with id 11a1531a3606a075151ec8528b356cf61451bde86d53f8bcb97fe6c18db87f8d
Apr 20 20:06:08.591652 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.591594 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9774d5fe_006d_4f52_a1aa_6ab55dcf9946.slice/crio-36e5fefe449f29f01dd3db90960639119d8520e8fbceeea0261b80864102ce29 WatchSource:0}: Error finding container 36e5fefe449f29f01dd3db90960639119d8520e8fbceeea0261b80864102ce29: Status 404 returned error can't find the container with id 36e5fefe449f29f01dd3db90960639119d8520e8fbceeea0261b80864102ce29
Apr 20 20:06:08.592525 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.592500 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda545fd66_1350_4958_93ed_a017391ba8d7.slice/crio-c990d6981c6644e2c600b4d5d2e554a3b72ac4de851fdf31b5c737ba7186cedf WatchSource:0}: Error finding container c990d6981c6644e2c600b4d5d2e554a3b72ac4de851fdf31b5c737ba7186cedf: Status 404 returned error can't find the container with id c990d6981c6644e2c600b4d5d2e554a3b72ac4de851fdf31b5c737ba7186cedf
Apr 20 20:06:08.595402 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:06:08.594269 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4f29d83_69da_4bc7_a3ce_9bcc03a224ed.slice/crio-13c02bed1960f02d12bf803303a08dd5a2f8b15448d5e3b8681b3a0b5808f474 WatchSource:0}: Error finding container 13c02bed1960f02d12bf803303a08dd5a2f8b15448d5e3b8681b3a0b5808f474: Status 404 returned error can't find the container with id 13c02bed1960f02d12bf803303a08dd5a2f8b15448d5e3b8681b3a0b5808f474
Apr 20 20:06:08.620609 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.620435 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 20:01:06 +0000 UTC" deadline="2027-12-10 18:27:53.711767267 +0000 UTC"
Apr 20 20:06:08.620609 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.620607 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14374h21m45.091163765s"
Apr 20 20:06:08.731503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.731467 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:08.731681 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:08.731588 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:08.740379 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.740346 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mjbn" event={"ID":"7307693c-15c5-4f2a-ac49-7a1626eb5a0d","Type":"ContainerStarted","Data":"ad42f000fd9192d388ac5691fda9155ba91591ac862c8b52514917b702af1d16"}
Apr 20 20:06:08.741339 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.741313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"632da09477d53612221a9c2c72e52fd76db8f71e2eba1c8f2f6e724bbd68cf04"}
Apr 20 20:06:08.742278 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.742251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t7s46" event={"ID":"bb961195-80bd-4743-b8f8-8b5ed4db814c","Type":"ContainerStarted","Data":"88218b6832e202fb0a1aa32715bb2943e4a731d4ed4819e3e2a5b9f707304c07"}
Apr 20 20:06:08.743721 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.743697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal" event={"ID":"d5ddc06d76dc03ab1fc5182830ef0d45","Type":"ContainerStarted","Data":"3bc69fe868e5f69219808d965f4e860b051b4ad6635d49a0a8183a08e3f4f3aa"}
Apr 20 20:06:08.744738 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.744719 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7t5kf" event={"ID":"a545fd66-1350-4958-93ed-a017391ba8d7","Type":"ContainerStarted","Data":"c990d6981c6644e2c600b4d5d2e554a3b72ac4de851fdf31b5c737ba7186cedf"}
Apr 20 20:06:08.745681 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.745660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerStarted","Data":"36e5fefe449f29f01dd3db90960639119d8520e8fbceeea0261b80864102ce29"}
Apr 20 20:06:08.746581 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.746562 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nqld6" event={"ID":"0cc0ab62-60fe-454e-bf85-90ca6410b3d1","Type":"ContainerStarted","Data":"f2aaa17bd40a897f693c4efe62534802a54aa00c0b7b2eb2d5ef8043d5a67e37"}
Apr 20 20:06:08.747598 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.747575 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n52n7" event={"ID":"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed","Type":"ContainerStarted","Data":"13c02bed1960f02d12bf803303a08dd5a2f8b15448d5e3b8681b3a0b5808f474"}
Apr 20 20:06:08.748610 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.748592 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qktfm" event={"ID":"f38465e3-0b1f-43c3-9a2b-3f294c21b82a","Type":"ContainerStarted","Data":"11a1531a3606a075151ec8528b356cf61451bde86d53f8bcb97fe6c18db87f8d"}
Apr 20 20:06:08.749547 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.749521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" event={"ID":"bcdae5fc-2620-4126-a962-272f1c67124b","Type":"ContainerStarted","Data":"9c3c809acffdef031d0b7e7d76f2d418b0d83f119338370b851426f954d4f310"}
Apr 20 20:06:08.755731 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:08.755682 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-130.ec2.internal" podStartSLOduration=1.755669282 podStartE2EDuration="1.755669282s" podCreationTimestamp="2026-04-20 20:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:08.755313035 +0000 UTC m=+3.602856089" watchObservedRunningTime="2026-04-20 20:06:08.755669282 +0000 UTC m=+3.603212322"
Apr 20 20:06:09.209508 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:09.209466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:09.209688 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.209655 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:09.209769 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.209725 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:11.209703397 +0000 UTC m=+6.057246429 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:09.411307 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:09.411249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:09.411469 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.411436 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:09.411469 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.411459 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:09.411583 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.411472 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:09.411583 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.411536 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:11.411517334 +0000 UTC m=+6.259060369 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:09.734783 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:09.734266 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:09.734783 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:09.734404 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:09.757164 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:09.757126 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" event={"ID":"e7850bbe80921bbe4ecd705caf3e4d36","Type":"ContainerStarted","Data":"26d59776200193bc1c7956e1e3aed42c86043148e5278a7e5c00d3b123cd5352"} Apr 20 20:06:10.731380 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:10.731344 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:10.731578 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:10.731490 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:10.778284 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:10.778245 2579 generic.go:358] "Generic (PLEG): container finished" podID="e7850bbe80921bbe4ecd705caf3e4d36" containerID="26d59776200193bc1c7956e1e3aed42c86043148e5278a7e5c00d3b123cd5352" exitCode=0 Apr 20 20:06:10.778708 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:10.778313 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" event={"ID":"e7850bbe80921bbe4ecd705caf3e4d36","Type":"ContainerDied","Data":"26d59776200193bc1c7956e1e3aed42c86043148e5278a7e5c00d3b123cd5352"} Apr 20 20:06:11.226816 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:11.226723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:11.227010 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.226908 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:11.227010 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.226973 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:15.226954973 +0000 UTC m=+10.074498010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:11.428787 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:11.428714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:11.428966 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.428940 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:11.428966 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.428962 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:11.429087 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.428974 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:11.429087 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.429034 2579 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:15.429015359 +0000 UTC m=+10.276558393 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:11.731760 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:11.731716 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:11.732041 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:11.732016 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:12.731692 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:12.731632 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:12.732099 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:12.731766 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:13.731446 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:13.731402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:13.731636 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:13.731558 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:14.731196 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:14.731161 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:14.731695 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:14.731310 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:15.260149 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:15.259525 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:15.260149 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.259695 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:15.260149 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.259756 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:23.259738664 +0000 UTC m=+18.107281688 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:15.461430 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:15.461385 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:15.461584 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.461573 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:15.461625 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.461589 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:15.461625 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.461599 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:15.461689 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.461654 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:23.461636529 +0000 UTC m=+18.309179570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:15.733667 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:15.733640 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:15.734111 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:15.733749 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:16.731576 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:16.731533 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:16.731775 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:16.731669 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:17.732256 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:17.732217 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:17.732700 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:17.732366 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:18.731398 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:18.731360 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:18.731581 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:18.731502 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:19.734746 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:19.734714 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:19.735234 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:19.734836 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:20.731818 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:20.731783 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:20.732009 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:20.731913 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:21.731631 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:21.731532 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:21.732105 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:21.731719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:22.731505 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:22.731467 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:22.731673 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:22.731588 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759" Apr 20 20:06:23.322164 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:23.322120 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:23.322353 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.322292 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:23.322405 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.322376 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:06:39.322355889 +0000 UTC m=+34.169898909 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 20:06:23.524181 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:23.524145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:23.524369 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.524286 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 20:06:23.524369 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.524309 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 20:06:23.524369 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.524324 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:23.524546 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.524388 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:39.52437376 +0000 UTC m=+34.371916786 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 20:06:23.731682 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:23.731589 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:23.732150 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:23.731734 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:06:24.732027 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:24.731991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:24.732450 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:24.732110 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:25.732954 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:25.732916 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:25.733384 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:25.733096 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:26.731577 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.731374 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:26.731702 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:26.731661 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:26.810482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.810296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7t5kf" event={"ID":"a545fd66-1350-4958-93ed-a017391ba8d7","Type":"ContainerStarted","Data":"731d962858ea5491b2106b0e023582bdd00825e23fd921c457a626b987cf101b"}
Apr 20 20:06:26.811835 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.811810 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="bf13d279aed02105fd96e054aa37b1b62909ad620c829cb6081a88030f30b48a" exitCode=0
Apr 20 20:06:26.811947 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.811893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"bf13d279aed02105fd96e054aa37b1b62909ad620c829cb6081a88030f30b48a"}
Apr 20 20:06:26.816520 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.816447 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" event={"ID":"e7850bbe80921bbe4ecd705caf3e4d36","Type":"ContainerStarted","Data":"ad0fdc9b862944a17441f8f76a89f79edfce2f27b75fe4e776c7c80f42ecb0d4"}
Apr 20 20:06:26.818324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.818296 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-n52n7" event={"ID":"f4f29d83-69da-4bc7-a3ce-9bcc03a224ed","Type":"ContainerStarted","Data":"3ae44f62487d4ef7a76c58344653cc01bf739722904359491f863b56a8811db5"}
Apr 20 20:06:26.823239 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.823202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qktfm" event={"ID":"f38465e3-0b1f-43c3-9a2b-3f294c21b82a","Type":"ContainerStarted","Data":"30aa1b3144b5f9cfe1b35c374d6117d545d42d71bba40471aa2cc2cbdeb4e140"}
Apr 20 20:06:26.824438 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.824398 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7t5kf" podStartSLOduration=4.271560583 podStartE2EDuration="21.824385217s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.598039199 +0000 UTC m=+3.445582234" lastFinishedPulling="2026-04-20 20:06:26.150863836 +0000 UTC m=+20.998406868" observedRunningTime="2026-04-20 20:06:26.823788098 +0000 UTC m=+21.671331140" watchObservedRunningTime="2026-04-20 20:06:26.824385217 +0000 UTC m=+21.671928458"
Apr 20 20:06:26.824954 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.824934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" event={"ID":"bcdae5fc-2620-4126-a962-272f1c67124b","Type":"ContainerStarted","Data":"75f485bffb2e801cf09196984d727d84a1c8fcf27f9b9b326c40e6ca137fd54d"}
Apr 20 20:06:26.826311 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.826196 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7mjbn" event={"ID":"7307693c-15c5-4f2a-ac49-7a1626eb5a0d","Type":"ContainerStarted","Data":"72b080d5376ddea05582e7d90d443c8ee1ccc1939b9d3b4a6c4ea0d4e8a820e4"}
Apr 20 20:06:26.827829 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.827813 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:06:26.828178 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.828158 2579 generic.go:358] "Generic (PLEG): container finished" podID="1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac" containerID="336c72248edb053637aceab2e542ca082250a42613491ebeecb407a52101526d" exitCode=1
Apr 20 20:06:26.828228 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.828219 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"459884e7a9c13b8c4b2ff6594d5fb44b1b792a1da910eecd9e15b9b4fc4f2b25"}
Apr 20 20:06:26.828277 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.828240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerDied","Data":"336c72248edb053637aceab2e542ca082250a42613491ebeecb407a52101526d"}
Apr 20 20:06:26.828277 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.828254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"3f48501c69d8f55031fb2e8dcd44a385c6b4ee1edef1420f48b093c543d407ad"}
Apr 20 20:06:26.829614 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.829591 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-t7s46" event={"ID":"bb961195-80bd-4743-b8f8-8b5ed4db814c","Type":"ContainerStarted","Data":"bceb054e7d49ab241f75ea1d3c95c71b18b22772ad99d497a9f9ded5bba2e4e1"}
Apr 20 20:06:26.879197 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.879156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qktfm" podStartSLOduration=4.273283226 podStartE2EDuration="21.87914068s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.592758221 +0000 UTC m=+3.440301253" lastFinishedPulling="2026-04-20 20:06:26.198615673 +0000 UTC m=+21.046158707" observedRunningTime="2026-04-20 20:06:26.864255142 +0000 UTC m=+21.711798183" watchObservedRunningTime="2026-04-20 20:06:26.87914068 +0000 UTC m=+21.726683725"
Apr 20 20:06:26.879297 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.879280 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-130.ec2.internal" podStartSLOduration=19.879276001 podStartE2EDuration="19.879276001s" podCreationTimestamp="2026-04-20 20:06:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:06:26.878745375 +0000 UTC m=+21.726288416" watchObservedRunningTime="2026-04-20 20:06:26.879276001 +0000 UTC m=+21.726819041"
Apr 20 20:06:26.892727 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.892597 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-n52n7" podStartSLOduration=4.315719548 podStartE2EDuration="21.892579666s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.59814941 +0000 UTC m=+3.445692428" lastFinishedPulling="2026-04-20 20:06:26.175009512 +0000 UTC m=+21.022552546" observedRunningTime="2026-04-20 20:06:26.892396371 +0000 UTC m=+21.739939403" watchObservedRunningTime="2026-04-20 20:06:26.892579666 +0000 UTC m=+21.740122708"
Apr 20 20:06:26.911039 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.910985 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7mjbn" podStartSLOduration=4.28538329 podStartE2EDuration="21.91097062s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.591395844 +0000 UTC m=+3.438938878" lastFinishedPulling="2026-04-20 20:06:26.216983175 +0000 UTC m=+21.064526208" observedRunningTime="2026-04-20 20:06:26.910774837 +0000 UTC m=+21.758317879" watchObservedRunningTime="2026-04-20 20:06:26.91097062 +0000 UTC m=+21.758513660"
Apr 20 20:06:26.927777 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:26.927731 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-t7s46" podStartSLOduration=4.337347752 podStartE2EDuration="21.927714527s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.584124278 +0000 UTC m=+3.431667302" lastFinishedPulling="2026-04-20 20:06:26.174491051 +0000 UTC m=+21.022034077" observedRunningTime="2026-04-20 20:06:26.927266185 +0000 UTC m=+21.774809226" watchObservedRunningTime="2026-04-20 20:06:26.927714527 +0000 UTC m=+21.775257567"
Apr 20 20:06:27.732027 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.731991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:27.732253 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:27.732150 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:27.834705 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.834679 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:06:27.835129 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.835090 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"88563397d17619b602a0371d5cc9acfd358822fa6cfa89fb49260ab8ed41f528"}
Apr 20 20:06:27.835178 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.835128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"9caa4afd43d7cad94669cbab22bb56e59c891d8ca8d7a0e684fd262ea32dc6ed"}
Apr 20 20:06:27.835178 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.835146 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"a609625eaf115efde2763bc3595a92f5e86e842586339bbb21e2b74afa939665"}
Apr 20 20:06:27.836609 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.836582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-nqld6" event={"ID":"0cc0ab62-60fe-454e-bf85-90ca6410b3d1","Type":"ContainerStarted","Data":"4da34ebe8a7e7da73b0061de0bbb68bf1f19f50dafe44395091d44f69cb20fcf"}
Apr 20 20:06:27.851193 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:27.851142 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-nqld6" podStartSLOduration=5.221582323 podStartE2EDuration="22.851126133s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.588938623 +0000 UTC m=+3.436481648" lastFinishedPulling="2026-04-20 20:06:26.218482425 +0000 UTC m=+21.066025458" observedRunningTime="2026-04-20 20:06:27.850950935 +0000 UTC m=+22.698493976" watchObservedRunningTime="2026-04-20 20:06:27.851126133 +0000 UTC m=+22.698669175"
Apr 20 20:06:28.039938 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.039904 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 20 20:06:28.659298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.659031 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T20:06:28.039931629Z","UUID":"c5e112ba-0bc7-49c6-844c-18f7f19db699","Handler":null,"Name":"","Endpoint":""}
Apr 20 20:06:28.662134 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.662101 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 20 20:06:28.662134 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.662141 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 20 20:06:28.731617 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.731572 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:28.731774 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:28.731691 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:28.840499 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:28.840432 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" event={"ID":"bcdae5fc-2620-4126-a962-272f1c67124b","Type":"ContainerStarted","Data":"fc2a04b934a98a50b3578ac5e3b762916aa638e7566fd905a2175b1b7e23bc1e"}
Apr 20 20:06:29.627720 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.627474 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:29.732183 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.732141 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:29.732364 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:29.732302 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:29.844279 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.844237 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" event={"ID":"bcdae5fc-2620-4126-a962-272f1c67124b","Type":"ContainerStarted","Data":"8aeaa72cf4f44bdea5d32480078a50aa8d18aee652916f4968c672f4b61db23d"}
Apr 20 20:06:29.847419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.847386 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:06:29.847764 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.847737 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"49a4bf3be76462cbf9e0ffadf048fadb8ef0c10173f656bf3365f2e9f5126e54"}
Apr 20 20:06:29.863036 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:29.862977 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-cpgcg" podStartSLOduration=4.282381574 podStartE2EDuration="24.862957512s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.592117886 +0000 UTC m=+3.439660905" lastFinishedPulling="2026-04-20 20:06:29.172693825 +0000 UTC m=+24.020236843" observedRunningTime="2026-04-20 20:06:29.862543562 +0000 UTC m=+24.710086606" watchObservedRunningTime="2026-04-20 20:06:29.862957512 +0000 UTC m=+24.710500555"
Apr 20 20:06:30.731552 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:30.731515 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:30.731768 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:30.731656 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:31.119213 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.119180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:31.119927 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.119909 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:31.732332 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.732067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:31.732473 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:31.732339 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:31.854363 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.854333 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:06:31.854691 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.854660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"694a15023bafcd0a1b4a1a44131447ef1c9248f8ab3e6ee64ae2feef6d5a7888"}
Apr 20 20:06:31.856119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.856080 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:31.856265 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.856159 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:31.856315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.856275 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="71786a4e23de5cf49a33e060367c6061753cbf283cde1436a8cf16673f2c741f" exitCode=0
Apr 20 20:06:31.856390 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.856371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"71786a4e23de5cf49a33e060367c6061753cbf283cde1436a8cf16673f2c741f"}
Apr 20 20:06:31.856456 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.856377 2579 scope.go:117] "RemoveContainer" containerID="336c72248edb053637aceab2e542ca082250a42613491ebeecb407a52101526d"
Apr 20 20:06:31.857133 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.857118 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-t7s46"
Apr 20 20:06:31.873185 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:31.873162 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:32.731561 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.731528 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:32.732234 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:32.731633 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:32.864731 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.864697 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:06:32.865086 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.865055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" event={"ID":"1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac","Type":"ContainerStarted","Data":"582c728cdac118dcb25a2a2d220877c43b4559ce150e59659d1ff22bebaf6ed7"}
Apr 20 20:06:32.865589 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.865565 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:32.881922 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.881894 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t"
Apr 20 20:06:32.895476 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:32.895426 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" podStartSLOduration=10.240071718 podStartE2EDuration="27.895409587s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.588174704 +0000 UTC m=+3.435717724" lastFinishedPulling="2026-04-20 20:06:26.243512556 +0000 UTC m=+21.091055593" observedRunningTime="2026-04-20 20:06:32.894941916 +0000 UTC m=+27.742484959" watchObservedRunningTime="2026-04-20 20:06:32.895409587 +0000 UTC m=+27.742952628"
Apr 20 20:06:33.333792 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.333758 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7wqd"]
Apr 20 20:06:33.334003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.333804 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ncwtr"]
Apr 20 20:06:33.334003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.333980 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:33.334203 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:33.334151 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:33.334595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.334575 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:33.334715 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:33.334671 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:33.869222 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.869040 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="a5564aa89e2b04cf5ec60f83e1dbfcdf13f5a2ba0f35ac9c9cf474795046e7b3" exitCode=0
Apr 20 20:06:33.869222 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:33.869127 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"a5564aa89e2b04cf5ec60f83e1dbfcdf13f5a2ba0f35ac9c9cf474795046e7b3"}
Apr 20 20:06:34.731413 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:34.731380 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:34.731596 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:34.731514 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:34.731596 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:34.731582 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:34.731717 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:34.731694 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:35.876802 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:35.876765 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="07c1f6d11ff5eec725a1e9bcb3d06094ba916b27336d53b3a44a76c6096f4e1f" exitCode=0
Apr 20 20:06:35.877273 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:35.876816 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"07c1f6d11ff5eec725a1e9bcb3d06094ba916b27336d53b3a44a76c6096f4e1f"}
Apr 20 20:06:36.732006 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:36.731966 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:36.732210 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:36.732026 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:36.732210 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:36.732112 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:36.732326 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:36.732243 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:38.731486 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:38.731448 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:38.732302 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:38.731448 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:38.732302 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:38.731571 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ncwtr" podUID="f462fd2a-e95e-4bb6-b40a-76ad1cf91759"
Apr 20 20:06:38.732302 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:38.731641 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c"
Apr 20 20:06:39.345438 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.345350 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:06:39.345594 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.345543 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:39.345657 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.345608 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:07:11.34558955 +0000 UTC m=+66.193132573 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 20:06:39.538035 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.538000 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-130.ec2.internal" event="NodeReady"
Apr 20 20:06:39.538215 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.538173 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 20:06:39.547137 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.547100 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:06:39.547319 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.547290 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 20:06:39.547319 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.547312 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 20:06:39.547424 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.547323 2579 projected.go:194] Error preparing data for projected volume kube-api-access-9hj65 for pod openshift-network-diagnostics/network-check-target-ncwtr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:39.547486 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.547435 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65 podName:f462fd2a-e95e-4bb6-b40a-76ad1cf91759 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:11.547415686 +0000 UTC m=+66.394958727 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-9hj65" (UniqueName: "kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65") pod "network-check-target-ncwtr" (UID: "f462fd2a-e95e-4bb6-b40a-76ad1cf91759") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 20:06:39.580272 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.580232 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-54k7f"]
Apr 20 20:06:39.582792 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.582656 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-54k7f"
Apr 20 20:06:39.583628 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.583603 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-zq94z"]
Apr 20 20:06:39.585097 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.585076 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 20:06:39.585414 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.585388 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 20:06:39.585573 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.585478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq94z"
Apr 20 20:06:39.585922 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.585906 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\""
Apr 20 20:06:39.587462 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.587437 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 20:06:39.587691 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.587673 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 20:06:39.587691 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.587687 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\""
Apr 20 20:06:39.588188 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.588170 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 20:06:39.592585 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.592369 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-54k7f"]
Apr 20 20:06:39.597461 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.597416 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zq94z"]
Apr 20 20:06:39.748608 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z"
Apr 20 20:06:39.749223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748634
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.749223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a614e742-616a-48e4-bb64-1c023ed6fecf-config-volume\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.749223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748716 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhrx4\" (UniqueName: \"kubernetes.io/projected/a614e742-616a-48e4-bb64-1c023ed6fecf-kube-api-access-lhrx4\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.749223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748735 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zjzr\" (UniqueName: \"kubernetes.io/projected/ac55a3c1-11b3-4209-b73c-91b790e26c17-kube-api-access-6zjzr\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:39.749223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.748841 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a614e742-616a-48e4-bb64-1c023ed6fecf-tmp-dir\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 
20:06:39.849618 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a614e742-616a-48e4-bb64-1c023ed6fecf-config-volume\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.849618 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849577 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhrx4\" (UniqueName: \"kubernetes.io/projected/a614e742-616a-48e4-bb64-1c023ed6fecf-kube-api-access-lhrx4\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.849618 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849601 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zjzr\" (UniqueName: \"kubernetes.io/projected/ac55a3c1-11b3-4209-b73c-91b790e26c17-kube-api-access-6zjzr\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849636 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a614e742-616a-48e4-bb64-1c023ed6fecf-tmp-dir\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:39.849870 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.849688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.849775 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.849791 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.849839 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:06:40.349819291 +0000 UTC m=+35.197362316 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:39.849870 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:39.849857 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:40.349848464 +0000 UTC m=+35.197391497 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:06:39.850233 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.850051 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a614e742-616a-48e4-bb64-1c023ed6fecf-tmp-dir\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.850288 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.850257 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a614e742-616a-48e4-bb64-1c023ed6fecf-config-volume\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:39.865126 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.865098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zjzr\" (UniqueName: \"kubernetes.io/projected/ac55a3c1-11b3-4209-b73c-91b790e26c17-kube-api-access-6zjzr\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:39.867331 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:39.867301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhrx4\" (UniqueName: \"kubernetes.io/projected/a614e742-616a-48e4-bb64-1c023ed6fecf-kube-api-access-lhrx4\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:40.353478 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.353423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:40.353655 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.353490 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:40.353655 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:40.353560 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:40.353655 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:40.353604 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:40.353655 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:40.353632 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:41.353615562 +0000 UTC m=+36.201158594 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:06:40.353655 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:40.353652 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:41.353640228 +0000 UTC m=+36.201183246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:40.731977 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.731933 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr" Apr 20 20:06:40.732157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.731990 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:06:40.735942 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.735904 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:06:40.735942 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.735912 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\"" Apr 20 20:06:40.736133 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.735974 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\"" Apr 20 20:06:40.736133 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.736093 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 20:06:40.736300 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:40.735827 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 20:06:41.361864 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:41.361816 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:41.361864 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:41.361866 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:41.362292 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:41.361989 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:41.362292 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:41.362049 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:43.362035093 +0000 UTC m=+38.209578112 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:06:41.362292 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:41.362049 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:41.362292 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:41.362105 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:43.362090834 +0000 UTC m=+38.209633857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:42.893002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:42.892789 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="1f995b8704970df28195ac6a65383bd3ae27b938f7a75e384abc867a849c4b4e" exitCode=0 Apr 20 20:06:42.893002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:42.892866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"1f995b8704970df28195ac6a65383bd3ae27b938f7a75e384abc867a849c4b4e"} Apr 20 20:06:43.375791 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:43.375749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:43.376003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:43.375843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:43.376003 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:43.375955 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:43.376003 ip-10-0-141-130 
kubenswrapper[2579]: E0420 20:06:43.375966 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:43.376102 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:43.376023 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:06:47.376005112 +0000 UTC m=+42.223548146 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:43.376102 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:43.376037 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:06:47.376031335 +0000 UTC m=+42.223574360 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:06:43.897861 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:43.897831 2579 generic.go:358] "Generic (PLEG): container finished" podID="9774d5fe-006d-4f52-a1aa-6ab55dcf9946" containerID="3e5e8c5910ab52ef3d4c7bcba31fd3bd892a12e05bc5aafc9336dfd8d3bff49a" exitCode=0 Apr 20 20:06:43.897861 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:43.897866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerDied","Data":"3e5e8c5910ab52ef3d4c7bcba31fd3bd892a12e05bc5aafc9336dfd8d3bff49a"} Apr 20 20:06:44.902959 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:44.902901 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" event={"ID":"9774d5fe-006d-4f52-a1aa-6ab55dcf9946","Type":"ContainerStarted","Data":"7c70f15a60682fffcb9570403f552a29d8c0625992e62561dce3bd922775a129"} Apr 20 20:06:44.929500 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:44.929433 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-vrb5p" podStartSLOduration=6.692988656 podStartE2EDuration="39.929403726s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:06:08.596620665 +0000 UTC m=+3.444163684" lastFinishedPulling="2026-04-20 20:06:41.833035732 +0000 UTC m=+36.680578754" observedRunningTime="2026-04-20 20:06:44.929061813 +0000 UTC m=+39.776604856" watchObservedRunningTime="2026-04-20 20:06:44.929403726 +0000 UTC m=+39.776946766" Apr 20 20:06:47.405363 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:47.405318 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:47.405363 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:47.405365 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:47.405781 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:47.405465 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:47.405781 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:47.405468 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:47.405781 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:47.405530 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:06:55.405515529 +0000 UTC m=+50.253058552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:47.405781 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:47.405544 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:06:55.405538109 +0000 UTC m=+50.253081132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:06:55.461423 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:55.461379 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:06:55.461423 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:06:55.461426 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:06:55.462014 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:55.461544 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:06:55.462014 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:55.461599 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:07:11.461582781 +0000 UTC m=+66.309125800 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found Apr 20 20:06:55.462014 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:55.461665 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:06:55.462014 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:06:55.461729 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:11.46171277 +0000 UTC m=+66.309255798 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found Apr 20 20:07:04.882777 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:04.882745 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-48d2t" Apr 20 20:07:11.379779 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.379731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:07:11.382132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.382111 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 20:07:11.390932 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.390900 
2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:07:11.391030 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.390996 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:08:15.390977176 +0000 UTC m=+130.238520195 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : secret "metrics-daemon-secret" not found Apr 20 20:07:11.480204 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.480173 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:07:11.480306 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.480209 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:07:11.480361 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.480325 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 20:07:11.480361 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.480325 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 20:07:11.480434 
ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.480404 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:07:43.480388107 +0000 UTC m=+98.327931129 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found
Apr 20 20:07:11.480434 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:11.480419 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:07:43.480413154 +0000 UTC m=+98.327956177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found
Apr 20 20:07:11.580500 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.580459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:07:11.583205 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.583186 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 20:07:11.593394 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.593370 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 20:07:11.604591 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.604564 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hj65\" (UniqueName: \"kubernetes.io/projected/f462fd2a-e95e-4bb6-b40a-76ad1cf91759-kube-api-access-9hj65\") pod \"network-check-target-ncwtr\" (UID: \"f462fd2a-e95e-4bb6-b40a-76ad1cf91759\") " pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:07:11.650206 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.650126 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9pkw7\""
Apr 20 20:07:11.658169 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.658135 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:07:11.831119 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.831090 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ncwtr"]
Apr 20 20:07:11.834568 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:07:11.834537 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf462fd2a_e95e_4bb6_b40a_76ad1cf91759.slice/crio-928c64217d9d622728e4b9e2f77b75e5383f8ab55992fa8d90b778590dd8d848 WatchSource:0}: Error finding container 928c64217d9d622728e4b9e2f77b75e5383f8ab55992fa8d90b778590dd8d848: Status 404 returned error can't find the container with id 928c64217d9d622728e4b9e2f77b75e5383f8ab55992fa8d90b778590dd8d848
Apr 20 20:07:11.958129 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:11.958032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ncwtr" event={"ID":"f462fd2a-e95e-4bb6-b40a-76ad1cf91759","Type":"ContainerStarted","Data":"928c64217d9d622728e4b9e2f77b75e5383f8ab55992fa8d90b778590dd8d848"}
Apr 20 20:07:14.965913 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:14.965858 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ncwtr" event={"ID":"f462fd2a-e95e-4bb6-b40a-76ad1cf91759","Type":"ContainerStarted","Data":"054d5d966d8818be944ea1b0f7ad72de00ab8962fe61aa952d68b9387e1e62db"}
Apr 20 20:07:14.966317 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:14.966018 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:07:14.983772 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:14.983619 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ncwtr" podStartSLOduration=66.988699882 podStartE2EDuration="1m9.983600358s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:07:11.836338378 +0000 UTC m=+66.683881401" lastFinishedPulling="2026-04-20 20:07:14.831238855 +0000 UTC m=+69.678781877" observedRunningTime="2026-04-20 20:07:14.982897269 +0000 UTC m=+69.830440301" watchObservedRunningTime="2026-04-20 20:07:14.983600358 +0000 UTC m=+69.831143400"
Apr 20 20:07:43.516191 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:43.516146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f"
Apr 20 20:07:43.516622 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:43.516219 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z"
Apr 20 20:07:43.516622 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:43.516293 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 20:07:43.516622 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:43.516310 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 20:07:43.516622 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:43.516380 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls podName:a614e742-616a-48e4-bb64-1c023ed6fecf nodeName:}" failed. No retries permitted until 2026-04-20 20:08:47.516354829 +0000 UTC m=+162.363897851 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls") pod "dns-default-54k7f" (UID: "a614e742-616a-48e4-bb64-1c023ed6fecf") : secret "dns-default-metrics-tls" not found
Apr 20 20:07:43.516622 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:07:43.516395 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert podName:ac55a3c1-11b3-4209-b73c-91b790e26c17 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:47.516388508 +0000 UTC m=+162.363931532 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert") pod "ingress-canary-zq94z" (UID: "ac55a3c1-11b3-4209-b73c-91b790e26c17") : secret "canary-serving-cert" not found
Apr 20 20:07:45.970559 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:07:45.970527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ncwtr"
Apr 20 20:08:09.913051 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.913017 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"]
Apr 20 20:08:09.914750 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.914733 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"
Apr 20 20:08:09.917719 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.917614 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 20 20:08:09.918027 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.917908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:08:09.918540 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.918514 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-h7g8n\""
Apr 20 20:08:09.920557 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.920535 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"]
Apr 20 20:08:09.922513 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.922496 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"]
Apr 20 20:08:09.922664 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.922648 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:09.924188 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.924171 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"]
Apr 20 20:08:09.924334 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.924316 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:09.926160 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.926138 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"
Apr 20 20:08:09.927424 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.927408 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 20:08:09.927526 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.927408 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 20 20:08:09.928466 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.928446 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 20 20:08:09.929365 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929346 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 20:08:09.929445 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929415 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"]
Apr 20 20:08:09.929767 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929750 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 20 20:08:09.929862 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929812 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 20 20:08:09.929862 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929846 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5vqcl\""
Apr 20 20:08:09.929993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929921 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 20 20:08:09.929993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929951 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 20 20:08:09.929993 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929960 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:08:09.930141 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.929952 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:08:09.930230 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.930214 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 20 20:08:09.930328 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.930313 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-bnvt7\""
Apr 20 20:08:09.930423 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.930407 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-r8p82\""
Apr 20 20:08:09.941847 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.941814 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"]
Apr 20 20:08:09.942490 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.942467 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"]
Apr 20 20:08:09.946180 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:09.946014 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"]
Apr 20 20:08:10.001344 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cb9p\" (UniqueName: \"kubernetes.io/projected/43faf1e3-dfc2-49bf-807e-cef46ba72873-kube-api-access-6cb9p\") pod \"volume-data-source-validator-7c6cbb6c87-sxmj9\" (UID: \"43faf1e3-dfc2-49bf-807e-cef46ba72873\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"
Apr 20 20:08:10.001344 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41883eb-364a-40b1-af2a-729a431739b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.001592 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001371 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng8qx\" (UniqueName: \"kubernetes.io/projected/c79c4964-96fa-4fde-9a9d-7928a8841145-kube-api-access-ng8qx\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.001592 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001465 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c79c4964-96fa-4fde-9a9d-7928a8841145-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.001592 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.001592 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dldkv\" (UniqueName: \"kubernetes.io/projected/b41883eb-364a-40b1-af2a-729a431739b1-kube-api-access-dldkv\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.001757 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.001618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41883eb-364a-40b1-af2a-729a431739b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.017597 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.017566 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"]
Apr 20 20:08:10.019571 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.019554 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.022723 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.022698 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 20 20:08:10.023016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.023006 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:08:10.023756 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.023734 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-htv85\""
Apr 20 20:08:10.024004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.023987 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 20 20:08:10.025489 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.025468 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-69djd"]
Apr 20 20:08:10.027372 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.027353 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 20 20:08:10.027891 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.027748 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg"]
Apr 20 20:08:10.027891 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.027809 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-69djd"
Apr 20 20:08:10.029637 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.029617 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg"
Apr 20 20:08:10.030838 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.030822 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 20 20:08:10.031101 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.031088 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 20 20:08:10.031452 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.031434 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 20 20:08:10.031664 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.031649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 20 20:08:10.031716 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.031688 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-vn65c\""
Apr 20 20:08:10.031908 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.031894 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-mmsnx\""
Apr 20 20:08:10.037967 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.037944 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 20 20:08:10.038570 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.038549 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"]
Apr 20 20:08:10.042659 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.042634 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-69djd"]
Apr 20 20:08:10.047052 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.047033 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg"]
Apr 20 20:08:10.102429 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c79c4964-96fa-4fde-9a9d-7928a8841145-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmqw\" (UniqueName: \"kubernetes.io/projected/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-kube-api-access-5wmqw\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102477 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dldkv\" (UniqueName: \"kubernetes.io/projected/b41883eb-364a-40b1-af2a-729a431739b1-kube-api-access-dldkv\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102530 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k45nn\" (UniqueName: \"kubernetes.io/projected/17c337e8-f179-4bd2-b623-1741863e9278-kube-api-access-k45nn\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102573 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.102619 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102599 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-config\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102626 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.102602 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41883eb-364a-40b1-af2a-729a431739b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102673 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cb9p\" (UniqueName: \"kubernetes.io/projected/43faf1e3-dfc2-49bf-807e-cef46ba72873-kube-api-access-6cb9p\") pod \"volume-data-source-validator-7c6cbb6c87-sxmj9\" (UID: \"43faf1e3-dfc2-49bf-807e-cef46ba72873\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.102722 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:10.60269641 +0000 UTC m=+125.450239438 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41883eb-364a-40b1-af2a-729a431739b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.103132 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.102773 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ng8qx\" (UniqueName: \"kubernetes.io/projected/c79c4964-96fa-4fde-9a9d-7928a8841145-kube-api-access-ng8qx\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.103463 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.103184 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b41883eb-364a-40b1-af2a-729a431739b1-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.103463 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.103307 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/c79c4964-96fa-4fde-9a9d-7928a8841145-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.105285 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.105259 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b41883eb-364a-40b1-af2a-729a431739b1-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.114428 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.114392 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng8qx\" (UniqueName: \"kubernetes.io/projected/c79c4964-96fa-4fde-9a9d-7928a8841145-kube-api-access-ng8qx\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"
Apr 20 20:08:10.114428 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.114403 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cb9p\" (UniqueName: \"kubernetes.io/projected/43faf1e3-dfc2-49bf-807e-cef46ba72873-kube-api-access-6cb9p\") pod \"volume-data-source-validator-7c6cbb6c87-sxmj9\" (UID: \"43faf1e3-dfc2-49bf-807e-cef46ba72873\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"
Apr 20 20:08:10.114584 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.114561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dldkv\" (UniqueName: \"kubernetes.io/projected/b41883eb-364a-40b1-af2a-729a431739b1-kube-api-access-dldkv\") pod \"kube-storage-version-migrator-operator-6769c5d45-l42zw\" (UID: \"b41883eb-364a-40b1-af2a-729a431739b1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"
Apr 20 20:08:10.203298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wmqw\" (UniqueName: \"kubernetes.io/projected/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-kube-api-access-5wmqw\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.203298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203279 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k45nn\" (UniqueName: \"kubernetes.io/projected/17c337e8-f179-4bd2-b623-1741863e9278-kube-api-access-k45nn\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"
Apr 20 20:08:10.203298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-trusted-ca\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd"
Apr 20 20:08:10.203563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.203563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-config\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.203563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203414 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st92b\" (UniqueName: \"kubernetes.io/projected/7f4a42ba-8098-4e0b-b398-f97f5e0c71d3-kube-api-access-st92b\") pod \"network-check-source-8894fc9bd-6m4mg\" (UID: \"7f4a42ba-8098-4e0b-b398-f97f5e0c71d3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg"
Apr 20 20:08:10.203563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203481 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-config\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd"
Apr 20 20:08:10.203563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"
Apr 20 20:08:10.203750 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc299172-ad2d-4bef-a2b2-de75054e20b5-serving-cert\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd"
Apr 20 20:08:10.203750 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203623 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqkp7\" (UniqueName: \"kubernetes.io/projected/fc299172-ad2d-4bef-a2b2-de75054e20b5-kube-api-access-mqkp7\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd"
Apr 20 20:08:10.203750 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.203580 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 20 20:08:10.203750 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.203728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls podName:17c337e8-f179-4bd2-b623-1741863e9278 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:10.703705933 +0000 UTC m=+125.551248968 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-n4p5j" (UID: "17c337e8-f179-4bd2-b623-1741863e9278") : secret "samples-operator-tls" not found
Apr 20 20:08:10.204003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.203984 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-config\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.205846 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.205829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-serving-cert\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.215977 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.215945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmqw\" (UniqueName: \"kubernetes.io/projected/09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd-kube-api-access-5wmqw\") pod \"service-ca-operator-d6fc45fc5-28z6v\" (UID: \"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"
Apr 20 20:08:10.216108 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.216079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k45nn\" (UniqueName: \"kubernetes.io/projected/17c337e8-f179-4bd2-b623-1741863e9278-kube-api-access-k45nn\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") "
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:10.228737 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.228705 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9" Apr 20 20:08:10.242750 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.242707 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" Apr 20 20:08:10.305029 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.304995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-trusted-ca\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.305248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.305066 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-st92b\" (UniqueName: \"kubernetes.io/projected/7f4a42ba-8098-4e0b-b398-f97f5e0c71d3-kube-api-access-st92b\") pod \"network-check-source-8894fc9bd-6m4mg\" (UID: \"7f4a42ba-8098-4e0b-b398-f97f5e0c71d3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" Apr 20 20:08:10.305248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.305094 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-config\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.305248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.305133 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc299172-ad2d-4bef-a2b2-de75054e20b5-serving-cert\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.305248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.305165 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqkp7\" (UniqueName: \"kubernetes.io/projected/fc299172-ad2d-4bef-a2b2-de75054e20b5-kube-api-access-mqkp7\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.306283 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.306221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-config\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.306591 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.306526 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc299172-ad2d-4bef-a2b2-de75054e20b5-trusted-ca\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.308721 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.308673 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc299172-ad2d-4bef-a2b2-de75054e20b5-serving-cert\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.315304 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.315268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-st92b\" (UniqueName: \"kubernetes.io/projected/7f4a42ba-8098-4e0b-b398-f97f5e0c71d3-kube-api-access-st92b\") pod \"network-check-source-8894fc9bd-6m4mg\" (UID: \"7f4a42ba-8098-4e0b-b398-f97f5e0c71d3\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" Apr 20 20:08:10.316135 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.316086 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqkp7\" (UniqueName: \"kubernetes.io/projected/fc299172-ad2d-4bef-a2b2-de75054e20b5-kube-api-access-mqkp7\") pod \"console-operator-9d4b6777b-69djd\" (UID: \"fc299172-ad2d-4bef-a2b2-de75054e20b5\") " pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.329512 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.329433 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" Apr 20 20:08:10.338387 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.338251 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:10.343841 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.343815 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" Apr 20 20:08:10.357223 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.357170 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9"] Apr 20 20:08:10.360887 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:10.360828 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43faf1e3_dfc2_49bf_807e_cef46ba72873.slice/crio-b72475bd9b00b82831f46157a556b48669605b1b66a363042d066ee6244ccce9 WatchSource:0}: Error finding container b72475bd9b00b82831f46157a556b48669605b1b66a363042d066ee6244ccce9: Status 404 returned error can't find the container with id b72475bd9b00b82831f46157a556b48669605b1b66a363042d066ee6244ccce9 Apr 20 20:08:10.375149 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.375118 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw"] Apr 20 20:08:10.377726 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:10.377676 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb41883eb_364a_40b1_af2a_729a431739b1.slice/crio-83093a44a0e3be1cb17b7386bb021f22280c625aa311043c7945fb6e439e7ee4 WatchSource:0}: Error finding container 83093a44a0e3be1cb17b7386bb021f22280c625aa311043c7945fb6e439e7ee4: Status 404 returned error can't find the container with id 83093a44a0e3be1cb17b7386bb021f22280c625aa311043c7945fb6e439e7ee4 Apr 20 20:08:10.475130 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.475101 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v"] Apr 20 20:08:10.478945 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:10.478913 2579 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d9eaa7_0f7e_4ce2_9009_ccea0b0f2ddd.slice/crio-0af9199207d8e149c1d4a1251a1e2e37de22eaf8131b1ccb55f3c16c63c11d52 WatchSource:0}: Error finding container 0af9199207d8e149c1d4a1251a1e2e37de22eaf8131b1ccb55f3c16c63c11d52: Status 404 returned error can't find the container with id 0af9199207d8e149c1d4a1251a1e2e37de22eaf8131b1ccb55f3c16c63c11d52 Apr 20 20:08:10.608140 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.608097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:10.608341 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.608277 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:10.608411 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.608403 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:11.608380171 +0000 UTC m=+126.455923214 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:10.695868 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.695836 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg"] Apr 20 20:08:10.699201 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:10.699139 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f4a42ba_8098_4e0b_b398_f97f5e0c71d3.slice/crio-cbd3890b3b14c9de17188167e41e06a46eebd53b584c920bbf3a0265fe8b3422 WatchSource:0}: Error finding container cbd3890b3b14c9de17188167e41e06a46eebd53b584c920bbf3a0265fe8b3422: Status 404 returned error can't find the container with id cbd3890b3b14c9de17188167e41e06a46eebd53b584c920bbf3a0265fe8b3422 Apr 20 20:08:10.699839 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.699820 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-69djd"] Apr 20 20:08:10.702989 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:10.702966 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc299172_ad2d_4bef_a2b2_de75054e20b5.slice/crio-19addaab2f0632f4292aee1829b23d82b3c3b781db4d2d9fe6bca087865f1f5c WatchSource:0}: Error finding container 19addaab2f0632f4292aee1829b23d82b3c3b781db4d2d9fe6bca087865f1f5c: Status 404 returned error can't find the container with id 19addaab2f0632f4292aee1829b23d82b3c3b781db4d2d9fe6bca087865f1f5c Apr 20 20:08:10.708676 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:10.708655 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:10.708813 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.708796 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:08:10.708897 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:10.708856 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls podName:17c337e8-f179-4bd2-b623-1741863e9278 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:11.708840761 +0000 UTC m=+126.556383780 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-n4p5j" (UID: "17c337e8-f179-4bd2-b623-1741863e9278") : secret "samples-operator-tls" not found Apr 20 20:08:11.082676 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.082582 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" event={"ID":"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd","Type":"ContainerStarted","Data":"0af9199207d8e149c1d4a1251a1e2e37de22eaf8131b1ccb55f3c16c63c11d52"} Apr 20 20:08:11.084422 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.084380 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9" 
event={"ID":"43faf1e3-dfc2-49bf-807e-cef46ba72873","Type":"ContainerStarted","Data":"b72475bd9b00b82831f46157a556b48669605b1b66a363042d066ee6244ccce9"} Apr 20 20:08:11.086220 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.086175 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" event={"ID":"b41883eb-364a-40b1-af2a-729a431739b1","Type":"ContainerStarted","Data":"83093a44a0e3be1cb17b7386bb021f22280c625aa311043c7945fb6e439e7ee4"} Apr 20 20:08:11.088329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.088295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" event={"ID":"7f4a42ba-8098-4e0b-b398-f97f5e0c71d3","Type":"ContainerStarted","Data":"b85476bf948741ec07a145d6e44e12f6c4d12ddb992bfcf75e740bd638ec0fca"} Apr 20 20:08:11.088329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.088331 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" event={"ID":"7f4a42ba-8098-4e0b-b398-f97f5e0c71d3","Type":"ContainerStarted","Data":"cbd3890b3b14c9de17188167e41e06a46eebd53b584c920bbf3a0265fe8b3422"} Apr 20 20:08:11.090723 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.090693 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" event={"ID":"fc299172-ad2d-4bef-a2b2-de75054e20b5","Type":"ContainerStarted","Data":"19addaab2f0632f4292aee1829b23d82b3c3b781db4d2d9fe6bca087865f1f5c"} Apr 20 20:08:11.105489 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.105108 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-6m4mg" podStartSLOduration=1.105084855 podStartE2EDuration="1.105084855s" podCreationTimestamp="2026-04-20 20:08:10 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:11.104606873 +0000 UTC m=+125.952149914" watchObservedRunningTime="2026-04-20 20:08:11.105084855 +0000 UTC m=+125.952627896" Apr 20 20:08:11.618605 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.618551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:11.618818 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:11.618800 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:11.618907 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:11.618895 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:13.618859202 +0000 UTC m=+128.466402229 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:11.719412 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:11.719368 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:11.719597 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:11.719542 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:08:11.719661 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:11.719613 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls podName:17c337e8-f179-4bd2-b623-1741863e9278 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:13.719591596 +0000 UTC m=+128.567134630 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-n4p5j" (UID: "17c337e8-f179-4bd2-b623-1741863e9278") : secret "samples-operator-tls" not found Apr 20 20:08:13.635815 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:13.635751 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:13.636354 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:13.635864 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:13.636354 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:13.635957 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:17.635943295 +0000 UTC m=+132.483486313 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:13.739029 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:13.738539 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:13.739029 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:13.738727 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:08:13.739029 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:13.738870 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls podName:17c337e8-f179-4bd2-b623-1741863e9278 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:17.738817624 +0000 UTC m=+132.586360648 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-n4p5j" (UID: "17c337e8-f179-4bd2-b623-1741863e9278") : secret "samples-operator-tls" not found Apr 20 20:08:14.101243 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.101197 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9" event={"ID":"43faf1e3-dfc2-49bf-807e-cef46ba72873","Type":"ContainerStarted","Data":"cbdd9fdbe952c1100d8a9ea58949b38e6087a2e8656f86293722b01fdfb0b369"} Apr 20 20:08:14.102602 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.102572 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" event={"ID":"b41883eb-364a-40b1-af2a-729a431739b1","Type":"ContainerStarted","Data":"dd3b7068a572639e564958e1596a277b0aeda7de595c79b7670f9ccadfcce7ae"} Apr 20 20:08:14.104112 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.104088 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/0.log" Apr 20 20:08:14.104341 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.104130 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc299172-ad2d-4bef-a2b2-de75054e20b5" containerID="c3d47daae913b5a9e360a9d0df138de592a6aef731963487a86369bf15660c07" exitCode=255 Apr 20 20:08:14.104341 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.104218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" event={"ID":"fc299172-ad2d-4bef-a2b2-de75054e20b5","Type":"ContainerDied","Data":"c3d47daae913b5a9e360a9d0df138de592a6aef731963487a86369bf15660c07"} Apr 20 20:08:14.104514 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.104432 2579 scope.go:117] "RemoveContainer" containerID="c3d47daae913b5a9e360a9d0df138de592a6aef731963487a86369bf15660c07" Apr 20 20:08:14.105845 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.105823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" event={"ID":"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd","Type":"ContainerStarted","Data":"dbdefc6865e7401770c26f9ca3fb086ba0314d547f5b0520ec8c7411bda2018e"} Apr 20 20:08:14.117095 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.117040 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-sxmj9" podStartSLOduration=1.910227049 podStartE2EDuration="5.117019704s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:10.363059635 +0000 UTC m=+125.210602657" lastFinishedPulling="2026-04-20 20:08:13.569852293 +0000 UTC m=+128.417395312" observedRunningTime="2026-04-20 20:08:14.116181591 +0000 UTC m=+128.963724632" watchObservedRunningTime="2026-04-20 20:08:14.117019704 +0000 UTC m=+128.964562747" Apr 20 20:08:14.150272 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:14.150208 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" podStartSLOduration=1.958287138 podStartE2EDuration="5.150188628s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:10.383303154 +0000 UTC m=+125.230846186" lastFinishedPulling="2026-04-20 20:08:13.575204657 +0000 UTC m=+128.422747676" observedRunningTime="2026-04-20 20:08:14.150069129 +0000 UTC m=+128.997612170" watchObservedRunningTime="2026-04-20 20:08:14.150188628 +0000 UTC m=+128.997731669" Apr 20 20:08:14.168259 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:08:14.168203 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" podStartSLOduration=1.072050934 podStartE2EDuration="4.168183232s" podCreationTimestamp="2026-04-20 20:08:10 +0000 UTC" firstStartedPulling="2026-04-20 20:08:10.480850127 +0000 UTC m=+125.328393147" lastFinishedPulling="2026-04-20 20:08:13.576982425 +0000 UTC m=+128.424525445" observedRunningTime="2026-04-20 20:08:14.166214196 +0000 UTC m=+129.013757239" watchObservedRunningTime="2026-04-20 20:08:14.168183232 +0000 UTC m=+129.015726275" Apr 20 20:08:15.110068 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110035 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:08:15.110509 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110454 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/0.log" Apr 20 20:08:15.110509 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110487 2579 generic.go:358] "Generic (PLEG): container finished" podID="fc299172-ad2d-4bef-a2b2-de75054e20b5" containerID="d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4" exitCode=255 Apr 20 20:08:15.110635 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110610 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" event={"ID":"fc299172-ad2d-4bef-a2b2-de75054e20b5","Type":"ContainerDied","Data":"d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4"} Apr 20 20:08:15.110689 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110655 2579 scope.go:117] "RemoveContainer" containerID="c3d47daae913b5a9e360a9d0df138de592a6aef731963487a86369bf15660c07" Apr 20 20:08:15.110890 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.110854 2579 scope.go:117] "RemoveContainer" containerID="d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4" Apr 20 20:08:15.111106 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:15.111087 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-69djd_openshift-console-operator(fc299172-ad2d-4bef-a2b2-de75054e20b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" podUID="fc299172-ad2d-4bef-a2b2-de75054e20b5" Apr 20 20:08:15.392908 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.392809 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n52n7_f4f29d83-69da-4bc7-a3ce-9bcc03a224ed/dns-node-resolver/0.log" Apr 20 20:08:15.453836 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.453800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:08:15.454022 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:15.453987 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 20:08:15.454101 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:15.454068 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs podName:5020f6ce-7061-484f-9b7d-89141a36e42c nodeName:}" failed. No retries permitted until 2026-04-20 20:10:17.454047013 +0000 UTC m=+252.301590036 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs") pod "network-metrics-daemon-g7wqd" (UID: "5020f6ce-7061-484f-9b7d-89141a36e42c") : secret "metrics-daemon-secret" not found Apr 20 20:08:15.992675 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:15.992644 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7t5kf_a545fd66-1350-4958-93ed-a017391ba8d7/node-ca/0.log" Apr 20 20:08:16.114744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:16.114715 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:08:16.115160 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:16.115086 2579 scope.go:117] "RemoveContainer" containerID="d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4" Apr 20 20:08:16.115261 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:16.115242 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-69djd_openshift-console-operator(fc299172-ad2d-4bef-a2b2-de75054e20b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" podUID="fc299172-ad2d-4bef-a2b2-de75054e20b5" Apr 20 20:08:17.673102 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.673063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:17.673512 ip-10-0-141-130 
kubenswrapper[2579]: E0420 20:08:17.673204 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:17.673512 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:17.673273 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:25.673256738 +0000 UTC m=+140.520799756 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:17.773853 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.773813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:17.774064 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:17.773985 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 20 20:08:17.774064 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:17.774052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls podName:17c337e8-f179-4bd2-b623-1741863e9278 nodeName:}" failed. 
No retries permitted until 2026-04-20 20:08:25.774037983 +0000 UTC m=+140.621581002 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-n4p5j" (UID: "17c337e8-f179-4bd2-b623-1741863e9278") : secret "samples-operator-tls" not found Apr 20 20:08:17.953455 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.953365 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g2nhx"] Apr 20 20:08:17.955639 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.955622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:17.958314 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.958288 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-jd2tz\"" Apr 20 20:08:17.958442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.958358 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 20:08:17.959277 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.959260 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 20:08:17.959368 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.959264 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 20:08:17.959508 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.959495 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 20:08:17.966688 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.966664 2579 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-service-ca/service-ca-865cb79987-g2nhx"] Apr 20 20:08:17.975198 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.975164 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-key\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:17.975321 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.975243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-cabundle\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:17.975321 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:17.975276 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9ccw\" (UniqueName: \"kubernetes.io/projected/7779b666-77ce-44f1-be0b-6510e2ec763f-kube-api-access-r9ccw\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.076002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.075963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-cabundle\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.076002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.076000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9ccw\" 
(UniqueName: \"kubernetes.io/projected/7779b666-77ce-44f1-be0b-6510e2ec763f-kube-api-access-r9ccw\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.076258 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.076075 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-key\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.076682 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.076659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-cabundle\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.078716 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.078693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7779b666-77ce-44f1-be0b-6510e2ec763f-signing-key\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.085521 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.085496 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9ccw\" (UniqueName: \"kubernetes.io/projected/7779b666-77ce-44f1-be0b-6510e2ec763f-kube-api-access-r9ccw\") pod \"service-ca-865cb79987-g2nhx\" (UID: \"7779b666-77ce-44f1-be0b-6510e2ec763f\") " pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.264495 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.264462 2579 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g2nhx" Apr 20 20:08:18.396140 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:18.396099 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g2nhx"] Apr 20 20:08:18.400349 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:18.400320 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7779b666_77ce_44f1_be0b_6510e2ec763f.slice/crio-9382d5eb4478e17c064a85ebb0433a2c7d052526e059422e7c404d9154b0681d WatchSource:0}: Error finding container 9382d5eb4478e17c064a85ebb0433a2c7d052526e059422e7c404d9154b0681d: Status 404 returned error can't find the container with id 9382d5eb4478e17c064a85ebb0433a2c7d052526e059422e7c404d9154b0681d Apr 20 20:08:19.124602 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:19.124562 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g2nhx" event={"ID":"7779b666-77ce-44f1-be0b-6510e2ec763f","Type":"ContainerStarted","Data":"76f72dafd79f98cdd3340cfb28ee70f8037827c2b1636ce3d0913c82d16c9eec"} Apr 20 20:08:19.124602 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:19.124604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g2nhx" event={"ID":"7779b666-77ce-44f1-be0b-6510e2ec763f","Type":"ContainerStarted","Data":"9382d5eb4478e17c064a85ebb0433a2c7d052526e059422e7c404d9154b0681d"} Apr 20 20:08:20.338473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:20.338431 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:20.338473 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:20.338481 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 
20:08:20.339126 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:20.339094 2579 scope.go:117] "RemoveContainer" containerID="d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4" Apr 20 20:08:20.339336 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:20.339315 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-69djd_openshift-console-operator(fc299172-ad2d-4bef-a2b2-de75054e20b5)\"" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" podUID="fc299172-ad2d-4bef-a2b2-de75054e20b5" Apr 20 20:08:25.732309 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.732266 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:25.732736 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:25.732417 2579 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:25.732736 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:25.732486 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls podName:c79c4964-96fa-4fde-9a9d-7928a8841145 nodeName:}" failed. No retries permitted until 2026-04-20 20:08:41.73247147 +0000 UTC m=+156.580014488 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-pfg5q" (UID: "c79c4964-96fa-4fde-9a9d-7928a8841145") : secret "cluster-monitoring-operator-tls" not found Apr 20 20:08:25.833155 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.833114 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:25.835777 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.835744 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/17c337e8-f179-4bd2-b623-1741863e9278-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-n4p5j\" (UID: \"17c337e8-f179-4bd2-b623-1741863e9278\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:25.849258 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.849222 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" Apr 20 20:08:25.986975 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.986834 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-g2nhx" podStartSLOduration=8.986814518 podStartE2EDuration="8.986814518s" podCreationTimestamp="2026-04-20 20:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:19.143921463 +0000 UTC m=+133.991464505" watchObservedRunningTime="2026-04-20 20:08:25.986814518 +0000 UTC m=+140.834357560" Apr 20 20:08:25.987582 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:25.987560 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j"] Apr 20 20:08:26.143614 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:26.143577 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" event={"ID":"17c337e8-f179-4bd2-b623-1741863e9278","Type":"ContainerStarted","Data":"64ec2c6db8d63a50ad4081cf609f9a9fc434e2928ae4f555879e4f92496f83d5"} Apr 20 20:08:28.150351 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:28.150242 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" event={"ID":"17c337e8-f179-4bd2-b623-1741863e9278","Type":"ContainerStarted","Data":"cfadf5940853fad83c6be37deb2d2513aa61553024aa555efd155ab6923ef4f9"} Apr 20 20:08:28.150351 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:28.150292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" 
event={"ID":"17c337e8-f179-4bd2-b623-1741863e9278","Type":"ContainerStarted","Data":"a2e003b9d7a8b7626dcb0a86eac931df6378889859a1a9a5c280823c4a8275b0"} Apr 20 20:08:28.172917 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:28.172846 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-n4p5j" podStartSLOduration=17.440024986 podStartE2EDuration="19.172828016s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:26.039584408 +0000 UTC m=+140.887127427" lastFinishedPulling="2026-04-20 20:08:27.772387425 +0000 UTC m=+142.619930457" observedRunningTime="2026-04-20 20:08:28.170939081 +0000 UTC m=+143.018482121" watchObservedRunningTime="2026-04-20 20:08:28.172828016 +0000 UTC m=+143.020371056" Apr 20 20:08:32.731431 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:32.731397 2579 scope.go:117] "RemoveContainer" containerID="d6ee725b9cf79ac119c2c8e58b8d8d8eefd20a65a49b9ed8a873fd5370e7a0d4" Apr 20 20:08:33.164339 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:33.164257 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:08:33.164483 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:33.164346 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" event={"ID":"fc299172-ad2d-4bef-a2b2-de75054e20b5","Type":"ContainerStarted","Data":"c5a39ccb62d72ab00d8518d1618762d47396fee96630e941da73e53e0267ddee"} Apr 20 20:08:33.164678 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:33.164655 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:33.193289 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:33.193233 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" podStartSLOduration=20.322669891 podStartE2EDuration="23.193218881s" podCreationTimestamp="2026-04-20 20:08:10 +0000 UTC" firstStartedPulling="2026-04-20 20:08:10.704575535 +0000 UTC m=+125.552118554" lastFinishedPulling="2026-04-20 20:08:13.575124522 +0000 UTC m=+128.422667544" observedRunningTime="2026-04-20 20:08:33.192232641 +0000 UTC m=+148.039775683" watchObservedRunningTime="2026-04-20 20:08:33.193218881 +0000 UTC m=+148.040761968" Apr 20 20:08:33.365165 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:33.365135 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-69djd" Apr 20 20:08:38.418386 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.418346 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwct6"] Apr 20 20:08:38.420518 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.420497 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.423866 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.423844 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-k8gjs\"" Apr 20 20:08:38.424631 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.424612 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 20:08:38.425931 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.425851 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gdfgp"] Apr 20 20:08:38.427044 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.427023 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 20:08:38.427965 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.427947 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.431641 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.431619 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 20:08:38.431842 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.431828 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jp4bq\"" Apr 20 20:08:38.431944 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.431924 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 20:08:38.431996 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.431977 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 20:08:38.432047 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.432015 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 20:08:38.449795 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.449754 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwct6"] Apr 20 20:08:38.467411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.467382 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gdfgp"] Apr 20 20:08:38.523834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.523797 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-2fvv4"] Apr 20 20:08:38.526054 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.526038 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2fvv4" Apr 20 20:08:38.534183 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534164 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-r5vzk\"" Apr 20 20:08:38.534294 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534219 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 20:08:38.534294 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534284 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 20:08:38.534387 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zsf8\" (UniqueName: \"kubernetes.io/projected/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-api-access-6zsf8\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.534449 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/babd65b4-ae72-4241-ac04-d1773d2684e6-crio-socket\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.534494 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/babd65b4-ae72-4241-ac04-d1773d2684e6-data-volume\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " 
pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.534543 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534505 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/babd65b4-ae72-4241-ac04-d1773d2684e6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.534543 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/157e2b24-16fe-402b-ad88-6a65a8c662f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.534641 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.534641 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.534603 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/157e2b24-16fe-402b-ad88-6a65a8c662f5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.538311 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:08:38.538291 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6598fdf965-7nq6v"] Apr 20 20:08:38.540173 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.540159 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.543126 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.543103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 20:08:38.543662 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.543648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 20:08:38.543969 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.543953 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-dkw75\"" Apr 20 20:08:38.544391 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.544374 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 20:08:38.553022 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.552996 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 20:08:38.553835 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.553815 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2fvv4"] Apr 20 20:08:38.583201 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.583163 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6598fdf965-7nq6v"] Apr 20 20:08:38.635281 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635236 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6zsf8\" (UniqueName: \"kubernetes.io/projected/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-api-access-6zsf8\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635281 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635280 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-installation-pull-secrets\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/babd65b4-ae72-4241-ac04-d1773d2684e6-crio-socket\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635333 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/babd65b4-ae72-4241-ac04-d1773d2684e6-data-volume\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/babd65b4-ae72-4241-ac04-d1773d2684e6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gdfgp\" (UID: 
\"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-bound-sa-token\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/babd65b4-ae72-4241-ac04-d1773d2684e6-crio-socket\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/157e2b24-16fe-402b-ad88-6a65a8c662f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-trusted-ca\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635714 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/babd65b4-ae72-4241-ac04-d1773d2684e6-data-volume\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635750 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-tls\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635818 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/157e2b24-16fe-402b-ad88-6a65a8c662f5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.635863 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-image-registry-private-configuration\") pod \"image-registry-6598fdf965-7nq6v\" (UID: 
\"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.636218 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-certificates\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.636218 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.635981 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72sbc\" (UniqueName: \"kubernetes.io/projected/c52fae83-c67e-4e0e-a29f-7df47f11231f-kube-api-access-72sbc\") pod \"downloads-6bcc868b7-2fvv4\" (UID: \"c52fae83-c67e-4e0e-a29f-7df47f11231f\") " pod="openshift-console/downloads-6bcc868b7-2fvv4" Apr 20 20:08:38.636218 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.636011 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98dd\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-kube-api-access-m98dd\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.636218 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.636075 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/609404ae-d83c-4bce-97c6-0c193b6b0485-ca-trust-extracted\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.636510 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:08:38.636493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/157e2b24-16fe-402b-ad88-6a65a8c662f5-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.636580 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.636561 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.638137 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.638108 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/babd65b4-ae72-4241-ac04-d1773d2684e6-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.638495 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.638478 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/157e2b24-16fe-402b-ad88-6a65a8c662f5-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-wwct6\" (UID: \"157e2b24-16fe-402b-ad88-6a65a8c662f5\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.648206 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.648168 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zsf8\" (UniqueName: \"kubernetes.io/projected/babd65b4-ae72-4241-ac04-d1773d2684e6-kube-api-access-6zsf8\") pod 
\"insights-runtime-extractor-gdfgp\" (UID: \"babd65b4-ae72-4241-ac04-d1773d2684e6\") " pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.729446 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.729411 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" Apr 20 20:08:38.736355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736316 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gdfgp" Apr 20 20:08:38.736526 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736475 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-installation-pull-secrets\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-bound-sa-token\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736652 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736587 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-trusted-ca\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736652 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736638 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-tls\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736752 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736679 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-image-registry-private-configuration\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736752 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-certificates\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736851 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736769 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72sbc\" (UniqueName: \"kubernetes.io/projected/c52fae83-c67e-4e0e-a29f-7df47f11231f-kube-api-access-72sbc\") pod \"downloads-6bcc868b7-2fvv4\" (UID: \"c52fae83-c67e-4e0e-a29f-7df47f11231f\") " pod="openshift-console/downloads-6bcc868b7-2fvv4" Apr 20 20:08:38.736851 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m98dd\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-kube-api-access-m98dd\") pod 
\"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.736851 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.736833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/609404ae-d83c-4bce-97c6-0c193b6b0485-ca-trust-extracted\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.737286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.737212 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/609404ae-d83c-4bce-97c6-0c193b6b0485-ca-trust-extracted\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.738002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.737978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-certificates\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.739621 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.739560 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/609404ae-d83c-4bce-97c6-0c193b6b0485-trusted-ca\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.739835 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.739796 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-installation-pull-secrets\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.741080 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.741052 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/609404ae-d83c-4bce-97c6-0c193b6b0485-image-registry-private-configuration\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.741080 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.741079 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-registry-tls\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.750913 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.750858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-bound-sa-token\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.751064 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.751030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98dd\" (UniqueName: \"kubernetes.io/projected/609404ae-d83c-4bce-97c6-0c193b6b0485-kube-api-access-m98dd\") pod \"image-registry-6598fdf965-7nq6v\" (UID: \"609404ae-d83c-4bce-97c6-0c193b6b0485\") " 
pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.751242 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.751224 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72sbc\" (UniqueName: \"kubernetes.io/projected/c52fae83-c67e-4e0e-a29f-7df47f11231f-kube-api-access-72sbc\") pod \"downloads-6bcc868b7-2fvv4\" (UID: \"c52fae83-c67e-4e0e-a29f-7df47f11231f\") " pod="openshift-console/downloads-6bcc868b7-2fvv4" Apr 20 20:08:38.834195 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.834161 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-2fvv4" Apr 20 20:08:38.848938 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.848900 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:38.877391 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.877316 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-wwct6"] Apr 20 20:08:38.880201 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:38.880160 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod157e2b24_16fe_402b_ad88_6a65a8c662f5.slice/crio-b2adcb218f50c9d35858061dfdee15e647d433fdb99d416a35c4fd629293c4fb WatchSource:0}: Error finding container b2adcb218f50c9d35858061dfdee15e647d433fdb99d416a35c4fd629293c4fb: Status 404 returned error can't find the container with id b2adcb218f50c9d35858061dfdee15e647d433fdb99d416a35c4fd629293c4fb Apr 20 20:08:38.901960 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:38.901924 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gdfgp"] Apr 20 20:08:38.907382 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:38.907351 2579 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbabd65b4_ae72_4241_ac04_d1773d2684e6.slice/crio-f923414f7a247c60611e57df8174730aea21decdb1abcbdab7d6495fba98b6ca WatchSource:0}: Error finding container f923414f7a247c60611e57df8174730aea21decdb1abcbdab7d6495fba98b6ca: Status 404 returned error can't find the container with id f923414f7a247c60611e57df8174730aea21decdb1abcbdab7d6495fba98b6ca Apr 20 20:08:39.006241 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.006210 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-2fvv4"] Apr 20 20:08:39.009511 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:39.009388 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc52fae83_c67e_4e0e_a29f_7df47f11231f.slice/crio-e512e70f805443ae0861c5752a3780fb82725439fbb2a7c04aa9cebfdc141722 WatchSource:0}: Error finding container e512e70f805443ae0861c5752a3780fb82725439fbb2a7c04aa9cebfdc141722: Status 404 returned error can't find the container with id e512e70f805443ae0861c5752a3780fb82725439fbb2a7c04aa9cebfdc141722 Apr 20 20:08:39.030995 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.030762 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6598fdf965-7nq6v"] Apr 20 20:08:39.035342 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:39.035291 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod609404ae_d83c_4bce_97c6_0c193b6b0485.slice/crio-17a4903596641912774a594b8c974016016326e0b4e7ba0c2936dacd59716caf WatchSource:0}: Error finding container 17a4903596641912774a594b8c974016016326e0b4e7ba0c2936dacd59716caf: Status 404 returned error can't find the container with id 17a4903596641912774a594b8c974016016326e0b4e7ba0c2936dacd59716caf Apr 20 20:08:39.180767 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:08:39.180724 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" event={"ID":"609404ae-d83c-4bce-97c6-0c193b6b0485","Type":"ContainerStarted","Data":"c1baa6a00a53bffa31fbcb479f127b2955970413faffbce625aa10452a6058f8"} Apr 20 20:08:39.180767 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.180771 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" event={"ID":"609404ae-d83c-4bce-97c6-0c193b6b0485","Type":"ContainerStarted","Data":"17a4903596641912774a594b8c974016016326e0b4e7ba0c2936dacd59716caf"} Apr 20 20:08:39.181031 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.180839 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" Apr 20 20:08:39.181912 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.181860 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" event={"ID":"157e2b24-16fe-402b-ad88-6a65a8c662f5","Type":"ContainerStarted","Data":"b2adcb218f50c9d35858061dfdee15e647d433fdb99d416a35c4fd629293c4fb"} Apr 20 20:08:39.182866 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.182838 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2fvv4" event={"ID":"c52fae83-c67e-4e0e-a29f-7df47f11231f","Type":"ContainerStarted","Data":"e512e70f805443ae0861c5752a3780fb82725439fbb2a7c04aa9cebfdc141722"} Apr 20 20:08:39.184023 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.184002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gdfgp" event={"ID":"babd65b4-ae72-4241-ac04-d1773d2684e6","Type":"ContainerStarted","Data":"bdfeefb87c5dce13895142f06c257b90512c1adc4c7d96d692ffde13bb01d90b"} Apr 20 20:08:39.184111 ip-10-0-141-130 kubenswrapper[2579]: 
I0420 20:08:39.184027 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gdfgp" event={"ID":"babd65b4-ae72-4241-ac04-d1773d2684e6","Type":"ContainerStarted","Data":"f923414f7a247c60611e57df8174730aea21decdb1abcbdab7d6495fba98b6ca"} Apr 20 20:08:39.202574 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:39.202518 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" podStartSLOduration=1.202503374 podStartE2EDuration="1.202503374s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:08:39.201675423 +0000 UTC m=+154.049218468" watchObservedRunningTime="2026-04-20 20:08:39.202503374 +0000 UTC m=+154.050046392" Apr 20 20:08:40.188695 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:40.188607 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" event={"ID":"157e2b24-16fe-402b-ad88-6a65a8c662f5","Type":"ContainerStarted","Data":"53f8bd8f4d286d3c8954fa5649415669bbb3b7cd21d3e46da518de9ae9881531"} Apr 20 20:08:40.190537 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:40.190505 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gdfgp" event={"ID":"babd65b4-ae72-4241-ac04-d1773d2684e6","Type":"ContainerStarted","Data":"e7d462a441849c53ac2025bba531df87cf263ebbe7abd13a1402cd65627278ec"} Apr 20 20:08:40.206957 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:40.206659 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-wwct6" podStartSLOduration=1.216081732 podStartE2EDuration="2.206641057s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="2026-04-20 
20:08:38.882747134 +0000 UTC m=+153.730290153" lastFinishedPulling="2026-04-20 20:08:39.873306444 +0000 UTC m=+154.720849478" observedRunningTime="2026-04-20 20:08:40.206238491 +0000 UTC m=+155.053781533" watchObservedRunningTime="2026-04-20 20:08:40.206641057 +0000 UTC m=+155.054184100" Apr 20 20:08:41.773521 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:41.773431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:41.776336 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:41.776309 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/c79c4964-96fa-4fde-9a9d-7928a8841145-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-pfg5q\" (UID: \"c79c4964-96fa-4fde-9a9d-7928a8841145\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:42.034975 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:42.034889 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" Apr 20 20:08:42.172370 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:42.172323 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q"] Apr 20 20:08:42.176230 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:42.176198 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc79c4964_96fa_4fde_9a9d_7928a8841145.slice/crio-5550f38c550060e7431b10fe1514f047d7df66d9ba56499bba9c8e61703069e4 WatchSource:0}: Error finding container 5550f38c550060e7431b10fe1514f047d7df66d9ba56499bba9c8e61703069e4: Status 404 returned error can't find the container with id 5550f38c550060e7431b10fe1514f047d7df66d9ba56499bba9c8e61703069e4 Apr 20 20:08:42.198551 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:42.198515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gdfgp" event={"ID":"babd65b4-ae72-4241-ac04-d1773d2684e6","Type":"ContainerStarted","Data":"0fb6153bdeb4149ba28b2ebc6e1a81448b8dfdee972396a0587662c924b1428a"} Apr 20 20:08:42.199460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:42.199440 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" event={"ID":"c79c4964-96fa-4fde-9a9d-7928a8841145","Type":"ContainerStarted","Data":"5550f38c550060e7431b10fe1514f047d7df66d9ba56499bba9c8e61703069e4"} Apr 20 20:08:42.217412 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:42.217344 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gdfgp" podStartSLOduration=1.7289890049999999 podStartE2EDuration="4.217320899s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="2026-04-20 20:08:39.01891627 +0000 UTC m=+153.866459289" 
lastFinishedPulling="2026-04-20 20:08:41.507248143 +0000 UTC m=+156.354791183" observedRunningTime="2026-04-20 20:08:42.216466329 +0000 UTC m=+157.064009371" watchObservedRunningTime="2026-04-20 20:08:42.217320899 +0000 UTC m=+157.064863953" Apr 20 20:08:42.595773 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:42.595722 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-54k7f" podUID="a614e742-616a-48e4-bb64-1c023ed6fecf" Apr 20 20:08:42.600956 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:42.600906 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-zq94z" podUID="ac55a3c1-11b3-4209-b73c-91b790e26c17" Apr 20 20:08:43.203301 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:43.202805 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:08:43.203301 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:43.202995 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-54k7f" Apr 20 20:08:43.752686 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:43.752634 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-g7wqd" podUID="5020f6ce-7061-484f-9b7d-89141a36e42c" Apr 20 20:08:44.208298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.208214 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" event={"ID":"c79c4964-96fa-4fde-9a9d-7928a8841145","Type":"ContainerStarted","Data":"81caf05de0c6206834f78edcea5af46a27ab5cbbb91ec72090e77b07adb92f52"} Apr 20 20:08:44.229050 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.228987 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-pfg5q" podStartSLOduration=33.500604795 podStartE2EDuration="35.228967707s" podCreationTimestamp="2026-04-20 20:08:09 +0000 UTC" firstStartedPulling="2026-04-20 20:08:42.178829695 +0000 UTC m=+157.026372715" lastFinishedPulling="2026-04-20 20:08:43.907192593 +0000 UTC m=+158.754735627" observedRunningTime="2026-04-20 20:08:44.227615259 +0000 UTC m=+159.075158326" watchObservedRunningTime="2026-04-20 20:08:44.228967707 +0000 UTC m=+159.076510749" Apr 20 20:08:44.490152 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.490116 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd"] Apr 20 20:08:44.494791 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.494765 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:44.498215 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.498184 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 20 20:08:44.499285 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.499259 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-85sm7\"" Apr 20 20:08:44.504369 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.504344 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd"] Apr 20 20:08:44.601633 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.601591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qbjkd\" (UID: \"f139f953-09ab-4649-bfd3-df6122181f1a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:44.702593 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:44.702547 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qbjkd\" (UID: \"f139f953-09ab-4649-bfd3-df6122181f1a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:44.702796 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:44.702733 2579 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret 
"prometheus-operator-admission-webhook-tls" not found Apr 20 20:08:44.702861 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:44.702814 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates podName:f139f953-09ab-4649-bfd3-df6122181f1a nodeName:}" failed. No retries permitted until 2026-04-20 20:08:45.20278962 +0000 UTC m=+160.050332644 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-qbjkd" (UID: "f139f953-09ab-4649-bfd3-df6122181f1a") : secret "prometheus-operator-admission-webhook-tls" not found Apr 20 20:08:45.206426 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:45.206384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qbjkd\" (UID: \"f139f953-09ab-4649-bfd3-df6122181f1a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:45.209749 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:45.209710 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/f139f953-09ab-4649-bfd3-df6122181f1a-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-qbjkd\" (UID: \"f139f953-09ab-4649-bfd3-df6122181f1a\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:45.407894 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:45.407836 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:45.552419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:45.552381 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd"] Apr 20 20:08:45.558539 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:45.558503 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf139f953_09ab_4649_bfd3_df6122181f1a.slice/crio-a4535602f57f049cc93d86db9a439699a0d8cd93c123865a82e875af9bb48749 WatchSource:0}: Error finding container a4535602f57f049cc93d86db9a439699a0d8cd93c123865a82e875af9bb48749: Status 404 returned error can't find the container with id a4535602f57f049cc93d86db9a439699a0d8cd93c123865a82e875af9bb48749 Apr 20 20:08:46.216097 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:46.216050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" event={"ID":"f139f953-09ab-4649-bfd3-df6122181f1a","Type":"ContainerStarted","Data":"a4535602f57f049cc93d86db9a439699a0d8cd93c123865a82e875af9bb48749"} Apr 20 20:08:47.220302 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.220201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" event={"ID":"f139f953-09ab-4649-bfd3-df6122181f1a","Type":"ContainerStarted","Data":"09e23a40e79645c749ee267bc7863f04966a4481f667208d55df25e4324df190"} Apr 20 20:08:47.220713 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.220370 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:47.225966 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.225937 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" Apr 20 20:08:47.241126 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.241057 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-qbjkd" podStartSLOduration=2.002809842 podStartE2EDuration="3.241036225s" podCreationTimestamp="2026-04-20 20:08:44 +0000 UTC" firstStartedPulling="2026-04-20 20:08:45.560966314 +0000 UTC m=+160.408509335" lastFinishedPulling="2026-04-20 20:08:46.799192687 +0000 UTC m=+161.646735718" observedRunningTime="2026-04-20 20:08:47.238975919 +0000 UTC m=+162.086518960" watchObservedRunningTime="2026-04-20 20:08:47.241036225 +0000 UTC m=+162.088579265" Apr 20 20:08:47.529105 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.529051 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:08:47.529105 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.529107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:08:47.532258 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.532229 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a614e742-616a-48e4-bb64-1c023ed6fecf-metrics-tls\") pod \"dns-default-54k7f\" (UID: \"a614e742-616a-48e4-bb64-1c023ed6fecf\") " pod="openshift-dns/dns-default-54k7f" Apr 20 20:08:47.532449 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:08:47.532426 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ac55a3c1-11b3-4209-b73c-91b790e26c17-cert\") pod \"ingress-canary-zq94z\" (UID: \"ac55a3c1-11b3-4209-b73c-91b790e26c17\") " pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:08:47.706329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.706293 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhk8x\"" Apr 20 20:08:47.706980 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.706948 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xwtbs\"" Apr 20 20:08:47.713748 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.713710 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-zq94z" Apr 20 20:08:47.713931 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.713724 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-54k7f" Apr 20 20:08:47.882402 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.882006 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-zq94z"] Apr 20 20:08:47.885258 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:47.885209 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac55a3c1_11b3_4209_b73c_91b790e26c17.slice/crio-e0e30714ed7b365514139928a2b99b91e658fb79d8d703d2f90201e3c35bf5c3 WatchSource:0}: Error finding container e0e30714ed7b365514139928a2b99b91e658fb79d8d703d2f90201e3c35bf5c3: Status 404 returned error can't find the container with id e0e30714ed7b365514139928a2b99b91e658fb79d8d703d2f90201e3c35bf5c3 Apr 20 20:08:47.901460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:47.901426 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-54k7f"] Apr 20 20:08:47.906260 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:47.906222 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda614e742_616a_48e4_bb64_1c023ed6fecf.slice/crio-ef2202e6ae737cb0f270280724c414fde5a06a09d5ec455bdfbfe23d23088704 WatchSource:0}: Error finding container ef2202e6ae737cb0f270280724c414fde5a06a09d5ec455bdfbfe23d23088704: Status 404 returned error can't find the container with id ef2202e6ae737cb0f270280724c414fde5a06a09d5ec455bdfbfe23d23088704 Apr 20 20:08:48.224678 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:48.224430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-54k7f" event={"ID":"a614e742-616a-48e4-bb64-1c023ed6fecf","Type":"ContainerStarted","Data":"ef2202e6ae737cb0f270280724c414fde5a06a09d5ec455bdfbfe23d23088704"} Apr 20 20:08:48.225784 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:48.225755 2579 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="openshift-ingress-canary/ingress-canary-zq94z" event={"ID":"ac55a3c1-11b3-4209-b73c-91b790e26c17","Type":"ContainerStarted","Data":"e0e30714ed7b365514139928a2b99b91e658fb79d8d703d2f90201e3c35bf5c3"} Apr 20 20:08:51.920977 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.920831 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-m5wpz"] Apr 20 20:08:51.924915 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.924863 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:51.928503 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.928474 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-7n892\"" Apr 20 20:08:51.929310 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.928797 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 20:08:51.929310 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.928859 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 20:08:51.929310 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.928859 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 20:08:51.929423 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:51.929343 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067562 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-metrics-client-ca\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067618 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067668 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-accelerators-collector-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067755 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-textfile\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " 
pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067783 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvh82\" (UniqueName: \"kubernetes.io/projected/e7242252-d37d-4f22-8f26-227293d15aed-kube-api-access-qvh82\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067816 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-sys\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067844 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-wtmp\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.068040 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.067915 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-root\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.168790 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.168747 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-wtmp\") pod 
\"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.168825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-root\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.168856 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-metrics-client-ca\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.168907 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.168964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-accelerators-collector-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.169012 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169267 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.169045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-textfile\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169267 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.169072 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvh82\" (UniqueName: \"kubernetes.io/projected/e7242252-d37d-4f22-8f26-227293d15aed-kube-api-access-qvh82\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.169267 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.169110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-sys\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.170383 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.169192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-sys\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.170383 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.170000 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-root\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.170383 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.170178 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-wtmp\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.170383 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.170312 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-accelerators-collector-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.172281 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.170987 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-textfile\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.172281 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.170950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e7242252-d37d-4f22-8f26-227293d15aed-metrics-client-ca\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.172281 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:52.171237 2579 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 20:08:52.172281 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:08:52.171323 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls podName:e7242252-d37d-4f22-8f26-227293d15aed nodeName:}" failed. No retries permitted until 2026-04-20 20:08:52.671299913 +0000 UTC m=+167.518842937 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls") pod "node-exporter-m5wpz" (UID: "e7242252-d37d-4f22-8f26-227293d15aed") : secret "node-exporter-tls" not found Apr 20 20:08:52.180603 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.180037 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.180784 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.180703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvh82\" (UniqueName: \"kubernetes.io/projected/e7242252-d37d-4f22-8f26-227293d15aed-kube-api-access-qvh82\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.674795 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.674729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 
20:08:52.677677 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.677645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e7242252-d37d-4f22-8f26-227293d15aed-node-exporter-tls\") pod \"node-exporter-m5wpz\" (UID: \"e7242252-d37d-4f22-8f26-227293d15aed\") " pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:52.838981 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:52.838932 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-m5wpz" Apr 20 20:08:54.983581 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:54.983543 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-668f954756-5c6pq"] Apr 20 20:08:54.999774 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:54.999696 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.002924 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.002889 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-8fp22\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.003072 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.003289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.003396 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-668f954756-5c6pq"] Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.003538 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bt22p66869ur\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.003720 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.004018 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 20:08:55.004330 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.004220 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 20:08:55.098110 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098071 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098118 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098181 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098249 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df8856cf-ad0c-428d-936d-9ac93ef7d700-metrics-client-ca\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098303 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098280 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-grpc-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098411 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7l8h\" (UniqueName: \"kubernetes.io/projected/df8856cf-ad0c-428d-936d-9ac93ef7d700-kube-api-access-h7l8h\") pod \"thanos-querier-668f954756-5c6pq\" 
(UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.098563 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.098462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199332 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7l8h\" (UniqueName: \"kubernetes.io/projected/df8856cf-ad0c-428d-936d-9ac93ef7d700-kube-api-access-h7l8h\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199343 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199464 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199496 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199533 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199531 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df8856cf-ad0c-428d-936d-9ac93ef7d700-metrics-client-ca\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" Apr 20 20:08:55.199808 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.199562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-grpc-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.202475 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.202115 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df8856cf-ad0c-428d-936d-9ac93ef7d700-metrics-client-ca\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.204475 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.204393 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.205135 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.205105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-grpc-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.206254 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.206194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-tls\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.206254 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.206194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.206666 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.206639 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.207445 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.207397 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/df8856cf-ad0c-428d-936d-9ac93ef7d700-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.211406 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.211379 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7l8h\" (UniqueName: \"kubernetes.io/projected/df8856cf-ad0c-428d-936d-9ac93ef7d700-kube-api-access-h7l8h\") pod \"thanos-querier-668f954756-5c6pq\" (UID: \"df8856cf-ad0c-428d-936d-9ac93ef7d700\") " pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.316730 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.316632 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:08:55.735540 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:55.735502 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd"
Apr 20 20:08:56.315620 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.315582 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-57958c75ff-6lnmd"]
Apr 20 20:08:56.320298 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.320271 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.322778 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.322747 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-wg675\""
Apr 20 20:08:56.323595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.323568 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 20 20:08:56.323690 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.323630 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 20 20:08:56.323915 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.323860 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-ea8a0e83lrp49\""
Apr 20 20:08:56.323915 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.323866 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 20 20:08:56.324134 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.323931 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 20 20:08:56.328178 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.328151 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57958c75ff-6lnmd"]
Apr 20 20:08:56.410006 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.409958 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-tls\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410006 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410009 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e5316e3-2107-49b1-b744-bd92a811e921-audit-log\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410246 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410076 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-client-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410246 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-client-certs\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410246 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410231 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410363 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410255 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-metrics-server-audit-profiles\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.410363 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.410306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmgtv\" (UniqueName: \"kubernetes.io/projected/4e5316e3-2107-49b1-b744-bd92a811e921-kube-api-access-wmgtv\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511182 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511139 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-tls\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e5316e3-2107-49b1-b744-bd92a811e921-audit-log\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-client-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511334 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-client-certs\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511361 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511597 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511386 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-metrics-server-audit-profiles\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511597 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511426 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmgtv\" (UniqueName: \"kubernetes.io/projected/4e5316e3-2107-49b1-b744-bd92a811e921-kube-api-access-wmgtv\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.511751 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.511725 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4e5316e3-2107-49b1-b744-bd92a811e921-audit-log\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.512118 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.512095 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.512703 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.512676 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4e5316e3-2107-49b1-b744-bd92a811e921-metrics-server-audit-profiles\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.514254 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.514233 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-client-ca-bundle\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.514388 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.514368 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-client-certs\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.514454 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.514369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4e5316e3-2107-49b1-b744-bd92a811e921-secret-metrics-server-tls\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.519562 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.519540 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmgtv\" (UniqueName: \"kubernetes.io/projected/4e5316e3-2107-49b1-b744-bd92a811e921-kube-api-access-wmgtv\") pod \"metrics-server-57958c75ff-6lnmd\" (UID: \"4e5316e3-2107-49b1-b744-bd92a811e921\") " pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.633719 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.633621 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:08:56.683640 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.683602 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"]
Apr 20 20:08:56.688216 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.688189 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:08:56.690684 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.690657 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-l68rj\""
Apr 20 20:08:56.690825 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.690699 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 20 20:08:56.693644 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.693611 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"]
Apr 20 20:08:56.814296 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.814258 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4031d84e-286b-453d-b9bf-80024787b990-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n95gd\" (UID: \"4031d84e-286b-453d-b9bf-80024787b990\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:08:56.845481 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:56.845437 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7242252_d37d_4f22_8f26_227293d15aed.slice/crio-bca7f45b0742a338431fe50de9316a85173087fa85baa978e7fbdc9232aa428b WatchSource:0}: Error finding container bca7f45b0742a338431fe50de9316a85173087fa85baa978e7fbdc9232aa428b: Status 404 returned error can't find the container with id bca7f45b0742a338431fe50de9316a85173087fa85baa978e7fbdc9232aa428b
Apr 20 20:08:56.916190 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.915296 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4031d84e-286b-453d-b9bf-80024787b990-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n95gd\" (UID: \"4031d84e-286b-453d-b9bf-80024787b990\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:08:56.920647 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:56.919534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/4031d84e-286b-453d-b9bf-80024787b990-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-n95gd\" (UID: \"4031d84e-286b-453d-b9bf-80024787b990\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:08:57.001137 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.001061 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:08:57.014696 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.014658 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-57958c75ff-6lnmd"]
Apr 20 20:08:57.023436 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:57.023381 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e5316e3_2107_49b1_b744_bd92a811e921.slice/crio-c9281b00f33eb9a91627964f305d5bdcd489b08b8b75da7a05dbdd6cbb315966 WatchSource:0}: Error finding container c9281b00f33eb9a91627964f305d5bdcd489b08b8b75da7a05dbdd6cbb315966: Status 404 returned error can't find the container with id c9281b00f33eb9a91627964f305d5bdcd489b08b8b75da7a05dbdd6cbb315966
Apr 20 20:08:57.051610 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.051468 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-668f954756-5c6pq"]
Apr 20 20:08:57.058321 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:57.058074 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf8856cf_ad0c_428d_936d_9ac93ef7d700.slice/crio-c018d36f290ed32299d44000e2a3b805baf608dad1ca4cc4a57b9b1f813febc0 WatchSource:0}: Error finding container c018d36f290ed32299d44000e2a3b805baf608dad1ca4cc4a57b9b1f813febc0: Status 404 returned error can't find the container with id c018d36f290ed32299d44000e2a3b805baf608dad1ca4cc4a57b9b1f813febc0
Apr 20 20:08:57.123905 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.123734 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-66d56bfd8-82vtl"]
Apr 20 20:08:57.133090 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.133049 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.137669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66d56bfd8-82vtl"]
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.138021 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.139012 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.139345 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.139556 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 20:08:57.140990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.140019 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 20:08:57.141484 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.141111 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-czvsx\""
Apr 20 20:08:57.146862 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.146831 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 20:08:57.185272 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.185212 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"]
Apr 20 20:08:57.195404 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:57.195371 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4031d84e_286b_453d_b9bf_80024787b990.slice/crio-6335c0b0566fa15c0b3a0c4010bd4c7fecb5bbda09a5bf1d3b5d00dff7dad079 WatchSource:0}: Error finding container 6335c0b0566fa15c0b3a0c4010bd4c7fecb5bbda09a5bf1d3b5d00dff7dad079: Status 404 returned error can't find the container with id 6335c0b0566fa15c0b3a0c4010bd4c7fecb5bbda09a5bf1d3b5d00dff7dad079
Apr 20 20:08:57.218657 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218804 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-metrics-client-ca\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218804 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218759 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218804 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218786 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-federate-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218982 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218815 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218982 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218835 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218982 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218858 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-serving-certs-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.218982 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.218924 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rhq5\" (UniqueName: \"kubernetes.io/projected/1163c73f-65e4-431f-926c-dbbdb381939d-kube-api-access-2rhq5\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.259601 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.259555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-zq94z" event={"ID":"ac55a3c1-11b3-4209-b73c-91b790e26c17","Type":"ContainerStarted","Data":"753eec5662ca55b61eb0edda144ca034e9f5eaaa51d94fbcaacf60805eed7b64"}
Apr 20 20:08:57.260954 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.260922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"c018d36f290ed32299d44000e2a3b805baf608dad1ca4cc4a57b9b1f813febc0"}
Apr 20 20:08:57.262720 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.262688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m5wpz" event={"ID":"e7242252-d37d-4f22-8f26-227293d15aed","Type":"ContainerStarted","Data":"bca7f45b0742a338431fe50de9316a85173087fa85baa978e7fbdc9232aa428b"}
Apr 20 20:08:57.264097 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.264069 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd" event={"ID":"4e5316e3-2107-49b1-b744-bd92a811e921","Type":"ContainerStarted","Data":"c9281b00f33eb9a91627964f305d5bdcd489b08b8b75da7a05dbdd6cbb315966"}
Apr 20 20:08:57.265740 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.265714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-54k7f" event={"ID":"a614e742-616a-48e4-bb64-1c023ed6fecf","Type":"ContainerStarted","Data":"eac8cef49406368bf47fbbb578f8c8e47e7b1629aff20aac3a958e01c1d174ba"}
Apr 20 20:08:57.265945 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.265924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-54k7f" event={"ID":"a614e742-616a-48e4-bb64-1c023ed6fecf","Type":"ContainerStarted","Data":"1dc637f6b3236ffd09e123a61aa4ca346e7f6eb5fab5b0963feab373cd9d8236"}
Apr 20 20:08:57.266111 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.266101 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-54k7f"
Apr 20 20:08:57.266966 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.266944 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd" event={"ID":"4031d84e-286b-453d-b9bf-80024787b990","Type":"ContainerStarted","Data":"6335c0b0566fa15c0b3a0c4010bd4c7fecb5bbda09a5bf1d3b5d00dff7dad079"}
Apr 20 20:08:57.268406 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.268377 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-2fvv4" event={"ID":"c52fae83-c67e-4e0e-a29f-7df47f11231f","Type":"ContainerStarted","Data":"3da482fb5931eaa8370a57506eef60aa9418a38cdb77fe171c022671b1a35f30"}
Apr 20 20:08:57.268854 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.268833 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-2fvv4"
Apr 20 20:08:57.276969 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.276906 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-zq94z" podStartSLOduration=129.319859054 podStartE2EDuration="2m18.276867976s" podCreationTimestamp="2026-04-20 20:06:39 +0000 UTC" firstStartedPulling="2026-04-20 20:08:47.888963669 +0000 UTC m=+162.736506703" lastFinishedPulling="2026-04-20 20:08:56.845972603 +0000 UTC m=+171.693515625" observedRunningTime="2026-04-20 20:08:57.27644001 +0000 UTC m=+172.123983051" watchObservedRunningTime="2026-04-20 20:08:57.276867976 +0000 UTC m=+172.124411021"
Apr 20 20:08:57.278955 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.278894 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-2fvv4"
Apr 20 20:08:57.295506 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.293743 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-2fvv4" podStartSLOduration=1.45310322 podStartE2EDuration="19.293726676s" podCreationTimestamp="2026-04-20 20:08:38 +0000 UTC" firstStartedPulling="2026-04-20 20:08:39.011690917 +0000 UTC m=+153.859233937" lastFinishedPulling="2026-04-20 20:08:56.852314369 +0000 UTC m=+171.699857393" observedRunningTime="2026-04-20 20:08:57.292397039 +0000 UTC m=+172.139940081" watchObservedRunningTime="2026-04-20 20:08:57.293726676 +0000 UTC m=+172.141269717"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320465 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rhq5\" (UniqueName: \"kubernetes.io/projected/1163c73f-65e4-431f-926c-dbbdb381939d-kube-api-access-2rhq5\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320548 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-metrics-client-ca\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-federate-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320697 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320722 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.320768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-serving-certs-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.322016 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.321557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-serving-certs-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.323063 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.322264 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-metrics-client-ca\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.323416 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.323367 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.324462 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.324440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.325024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.324996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-telemeter-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.325832 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.325808 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.327276 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.327230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/1163c73f-65e4-431f-926c-dbbdb381939d-federate-client-tls\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.329378 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.329323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rhq5\" (UniqueName: \"kubernetes.io/projected/1163c73f-65e4-431f-926c-dbbdb381939d-kube-api-access-2rhq5\") pod \"telemeter-client-66d56bfd8-82vtl\" (UID: \"1163c73f-65e4-431f-926c-dbbdb381939d\") " pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl"
Apr 20 20:08:57.338419 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.338086 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-54k7f" podStartSLOduration=129.401807007 podStartE2EDuration="2m18.338065627s" podCreationTimestamp="2026-04-20 20:06:39 +0000 UTC" firstStartedPulling="2026-04-20 20:08:47.908406823 +0000 UTC m=+162.755949842" lastFinishedPulling="2026-04-20 20:08:56.844665429 +0000 UTC m=+171.692208462" observedRunningTime="2026-04-20 20:08:57.316272502 +0000 UTC m=+172.163815556" watchObservedRunningTime="2026-04-20 20:08:57.338065627 +0000 UTC m=+172.185608669"
Apr 20 20:08:57.458867 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.458749 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" Apr 20 20:08:57.645304 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:57.644844 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-66d56bfd8-82vtl"] Apr 20 20:08:57.804785 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:08:57.804739 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1163c73f_65e4_431f_926c_dbbdb381939d.slice/crio-f2a1670ec2e7d3a2a7721f15855e29e415915ada8c1734900252235342bec30d WatchSource:0}: Error finding container f2a1670ec2e7d3a2a7721f15855e29e415915ada8c1734900252235342bec30d: Status 404 returned error can't find the container with id f2a1670ec2e7d3a2a7721f15855e29e415915ada8c1734900252235342bec30d Apr 20 20:08:58.144217 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.143988 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:08:58.156567 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.156531 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.161745 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.161067 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ecf1ml8r5ghd\"" Apr 20 20:08:58.161745 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.161448 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.163118 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.163231 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cgq4b\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.163605 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.163827 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.165061 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.165193 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:08:58.165411 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.165271 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:08:58.166493 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.166073 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:08:58.166493 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.166263 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:08:58.166493 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.166356 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:08:58.166713 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.166520 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:08:58.167425 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.167385 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:08:58.167528 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.167421 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232463 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232493 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232572 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m4k8\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232628 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232653 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232756 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232794 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232898 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.233293 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232958 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.234220 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.232982 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.234220 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.233043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.275767 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.275703 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" event={"ID":"1163c73f-65e4-431f-926c-dbbdb381939d","Type":"ContainerStarted","Data":"f2a1670ec2e7d3a2a7721f15855e29e415915ada8c1734900252235342bec30d"} Apr 20 20:08:58.280039 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.279253 2579 generic.go:358] "Generic (PLEG): container finished" podID="e7242252-d37d-4f22-8f26-227293d15aed" containerID="068c09d60c8c5639199c603d1520ccb2ed4a13e4984adb453b6101779abcefec" exitCode=0 Apr 20 20:08:58.280039 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.279340 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m5wpz" 
event={"ID":"e7242252-d37d-4f22-8f26-227293d15aed","Type":"ContainerDied","Data":"068c09d60c8c5639199c603d1520ccb2ed4a13e4984adb453b6101779abcefec"} Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334478 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334504 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334558 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:08:58.334643 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334681 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334730 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334760 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:08:58.334839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m4k8\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.335355 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.336821 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.334965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.336821 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.335825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.339198 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.337900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.340073 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.340036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.342536 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.340182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.342982 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.342659 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.344115 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.343301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.344115 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.343802 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.344787 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.344707 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.345181 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.345105 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.346523 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.346495 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.346741 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.346718 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.346831 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.346787 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.347987 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.347200 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.347987 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.347431 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.347987 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.347569 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:08:58.348199 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.348010 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:58.349753 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.349730 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:58.351513 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.351487 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m4k8\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8\") pod \"prometheus-k8s-0\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:58.473802 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.473377 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:08:58.854805 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.854411 2579 patch_prober.go:28] interesting pod/image-registry-6598fdf965-7nq6v container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 20:08:58.854805 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:58.854490 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v" podUID="609404ae-d83c-4bce-97c6-0c193b6b0485" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 20:08:59.289015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:08:59.288941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m5wpz" event={"ID":"e7242252-d37d-4f22-8f26-227293d15aed","Type":"ContainerStarted","Data":"ddec811175f85df8b5710c1fd65cd586021a6564b2ad6a3b80356ad6a94e6c0b"}
Apr 20 20:09:00.196192 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:00.196112 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6598fdf965-7nq6v"
Apr 20 20:09:01.980141 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:01.980107 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:09:01.985414 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:09:01.985311 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43ef1974_2bb9_4c57_873d_71f4079ce1a4.slice/crio-776275f9eb046ec2cd919df1cf09ba8a7dbb13d8948b5b91a1817b508cce4834 WatchSource:0}: Error finding container 776275f9eb046ec2cd919df1cf09ba8a7dbb13d8948b5b91a1817b508cce4834: Status 404 returned error can't find the container with id 776275f9eb046ec2cd919df1cf09ba8a7dbb13d8948b5b91a1817b508cce4834
Apr 20 20:09:02.300135 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.300046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"776275f9eb046ec2cd919df1cf09ba8a7dbb13d8948b5b91a1817b508cce4834"}
Apr 20 20:09:02.302501 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.302465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"f8cb4131be035fd63d53649238037599d9f3968d28dd0d285cf9aa4bc5985967"}
Apr 20 20:09:02.302501 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.302502 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"c518bad713eef45dfc7e924cd5823cfd75d6fa90afb4edb56e61d12809262513"}
Apr 20 20:09:02.302696 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.302518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"a388df93e19bc1a50b0704bbce853d8cb187e3a009b10862153f3f41f5ac4ed5"}
Apr 20 20:09:02.305061 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.305016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-m5wpz" event={"ID":"e7242252-d37d-4f22-8f26-227293d15aed","Type":"ContainerStarted","Data":"cc3c90eb755cbf9b70983fc3fe9ac3ff06107eb9ed831848c1b0a1d7980527d3"}
Apr 20 20:09:02.306675 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.306636 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd" event={"ID":"4e5316e3-2107-49b1-b744-bd92a811e921","Type":"ContainerStarted","Data":"be44a8954b6e8ba8258b7cf41e8b14daae4bed0a72e92fdf58349a67bbe3c0be"}
Apr 20 20:09:02.308314 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.308252 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd" event={"ID":"4031d84e-286b-453d-b9bf-80024787b990","Type":"ContainerStarted","Data":"2ff97dfedcdde3907e716aaab235b7a9cfb95c65755b40b2602c0a82b79afdf6"}
Apr 20 20:09:02.308497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.308480 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:09:02.309836 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.309794 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" event={"ID":"1163c73f-65e4-431f-926c-dbbdb381939d","Type":"ContainerStarted","Data":"ac777e76259ce928cd2b7d4657018e9bbca6b0ca6ebe6daf73b12d03e301d5eb"}
Apr 20 20:09:02.315230 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.315204 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd"
Apr 20 20:09:02.349535 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.349459 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-m5wpz" podStartSLOduration=10.339782456 podStartE2EDuration="11.349439644s" podCreationTimestamp="2026-04-20 20:08:51 +0000 UTC" firstStartedPulling="2026-04-20 20:08:56.848317532 +0000 UTC m=+171.695860554" lastFinishedPulling="2026-04-20 20:08:57.857974715 +0000 UTC m=+172.705517742" observedRunningTime="2026-04-20 20:09:02.329246467 +0000 UTC m=+177.176789533" watchObservedRunningTime="2026-04-20 20:09:02.349439644 +0000 UTC m=+177.196982687"
Apr 20 20:09:02.350102 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.350064 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd" podStartSLOduration=1.5660587719999999 podStartE2EDuration="6.350052785s" podCreationTimestamp="2026-04-20 20:08:56 +0000 UTC" firstStartedPulling="2026-04-20 20:08:57.026733014 +0000 UTC m=+171.874276048" lastFinishedPulling="2026-04-20 20:09:01.810727043 +0000 UTC m=+176.658270061" observedRunningTime="2026-04-20 20:09:02.349198518 +0000 UTC m=+177.196741563" watchObservedRunningTime="2026-04-20 20:09:02.350052785 +0000 UTC m=+177.197595854"
Apr 20 20:09:02.362187 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:02.362128 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-n95gd" podStartSLOduration=1.7484718049999999 podStartE2EDuration="6.362111658s" podCreationTimestamp="2026-04-20 20:08:56 +0000 UTC" firstStartedPulling="2026-04-20 20:08:57.19811254 +0000 UTC m=+172.045655566" lastFinishedPulling="2026-04-20 20:09:01.8117524 +0000 UTC m=+176.659295419" observedRunningTime="2026-04-20 20:09:02.361785084 +0000 UTC m=+177.209328132" watchObservedRunningTime="2026-04-20 20:09:02.362111658 +0000 UTC m=+177.209654699"
Apr 20 20:09:05.327974 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.327921 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"963ab57ceda4ae4fd80cafa2129052cdb012cc91cf5753bdd2921fc903a712ab"}
Apr 20 20:09:05.327974 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.327981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"162033738f19592ab4b4dc7eeecf454e8d361547ac5b40504d13266e7cf54dc1"}
Apr 20 20:09:05.328566 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.328002 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" event={"ID":"df8856cf-ad0c-428d-936d-9ac93ef7d700","Type":"ContainerStarted","Data":"f6bc03c25ef7a2ff6c805cd581057e12da119472c3663e26294aeda33b30fa0f"}
Apr 20 20:09:05.328566 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.328115 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:09:05.330278 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.330247 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" event={"ID":"1163c73f-65e4-431f-926c-dbbdb381939d","Type":"ContainerStarted","Data":"12162bd39cc389c8590805d432d23b7b6b33c6886f45520f6fa4bddec08d0f11"}
Apr 20 20:09:05.330417 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.330282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" event={"ID":"1163c73f-65e4-431f-926c-dbbdb381939d","Type":"ContainerStarted","Data":"13ac0d1581623894277a157de7aba8cbe89f236a22b3217fd9b87e401ce845fb"}
Apr 20 20:09:05.332174 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.332145 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" exitCode=0
Apr 20 20:09:05.332282 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.332187 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"}
Apr 20 20:09:05.354324 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.354260 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq" podStartSLOduration=4.243859225 podStartE2EDuration="11.354239191s" podCreationTimestamp="2026-04-20 20:08:54 +0000 UTC" firstStartedPulling="2026-04-20 20:08:57.067096429 +0000 UTC m=+171.914639454" lastFinishedPulling="2026-04-20 20:09:04.177476391 +0000 UTC m=+179.025019420" observedRunningTime="2026-04-20 20:09:05.34901542 +0000 UTC m=+180.196558462" watchObservedRunningTime="2026-04-20 20:09:05.354239191 +0000 UTC m=+180.201782243"
Apr 20 20:09:05.370292 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:05.370229 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-66d56bfd8-82vtl" podStartSLOduration=2.001242578 podStartE2EDuration="8.370213992s" podCreationTimestamp="2026-04-20 20:08:57 +0000 UTC" firstStartedPulling="2026-04-20 20:08:57.809204242 +0000 UTC m=+172.656747267" lastFinishedPulling="2026-04-20 20:09:04.178175656 +0000 UTC m=+179.025718681" observedRunningTime="2026-04-20 20:09:05.370187638 +0000 UTC m=+180.217730681" watchObservedRunningTime="2026-04-20 20:09:05.370213992 +0000 UTC m=+180.217757033"
Apr 20 20:09:08.292410 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:08.292360 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-54k7f"
Apr 20 20:09:09.353709 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:09.353676 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"}
Apr 20 20:09:09.354098 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:09.353718 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"}
Apr 20 20:09:09.354098 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:09.353732 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"}
Apr 20 20:09:09.354098 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:09.353744 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"}
Apr 20 20:09:10.359248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:10.359206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"}
Apr 20 20:09:10.359248 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:10.359251 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerStarted","Data":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"}
Apr 20 20:09:10.391542 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:10.391483 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.542386071 podStartE2EDuration="12.391462725s" podCreationTimestamp="2026-04-20 20:08:58 +0000 UTC" firstStartedPulling="2026-04-20 20:09:01.987282116 +0000 UTC m=+176.834825136" lastFinishedPulling="2026-04-20 20:09:08.836358769 +0000 UTC m=+183.683901790" observedRunningTime="2026-04-20 20:09:10.390817644 +0000 UTC m=+185.238360696" watchObservedRunningTime="2026-04-20 20:09:10.391462725 +0000 UTC m=+185.239005767"
Apr 20 20:09:11.342511 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:11.342484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-668f954756-5c6pq"
Apr 20 20:09:13.474314 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:13.474269 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:09:16.633930 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:16.633892 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:09:16.633930 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:16.633938 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:09:20.394526 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:20.394437 2579 generic.go:358] "Generic (PLEG): container finished" podID="b41883eb-364a-40b1-af2a-729a431739b1" containerID="dd3b7068a572639e564958e1596a277b0aeda7de595c79b7670f9ccadfcce7ae" exitCode=0
Apr 20 20:09:20.394526 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:20.394511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" event={"ID":"b41883eb-364a-40b1-af2a-729a431739b1","Type":"ContainerDied","Data":"dd3b7068a572639e564958e1596a277b0aeda7de595c79b7670f9ccadfcce7ae"}
Apr 20 20:09:20.394947 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:20.394866 2579 scope.go:117] "RemoveContainer" containerID="dd3b7068a572639e564958e1596a277b0aeda7de595c79b7670f9ccadfcce7ae"
Apr 20 20:09:21.401885 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:21.401836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-l42zw" event={"ID":"b41883eb-364a-40b1-af2a-729a431739b1","Type":"ContainerStarted","Data":"2935bc5b37f4a7a448476f8cf84aa6d54f86972899728e3c95ee916a2579f2f3"}
Apr 20 20:09:25.415942 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:25.415908 2579 generic.go:358] "Generic (PLEG): container finished" podID="09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd" containerID="dbdefc6865e7401770c26f9ca3fb086ba0314d547f5b0520ec8c7411bda2018e" exitCode=0
Apr 20 20:09:25.416401 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:25.415966 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" event={"ID":"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd","Type":"ContainerDied","Data":"dbdefc6865e7401770c26f9ca3fb086ba0314d547f5b0520ec8c7411bda2018e"}
Apr 20 20:09:25.416401 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:25.416354 2579 scope.go:117] "RemoveContainer" containerID="dbdefc6865e7401770c26f9ca3fb086ba0314d547f5b0520ec8c7411bda2018e"
Apr 20 20:09:26.420208 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:26.420173 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-28z6v" event={"ID":"09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd","Type":"ContainerStarted","Data":"f1df8bd80571b4ff7c87e34258520743299ba8d14fb979ee28824a52096c28cf"}
Apr 20 20:09:36.640349 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:36.640317 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:09:36.644405 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:36.644373 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-57958c75ff-6lnmd"
Apr 20 20:09:58.474595 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:58.474538 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:09:58.495135 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:58.495102 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:09:58.536972 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:09:58.536932 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:10:16.663902 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.663846 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 20:10:16.664460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664395 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="prometheus" containerID="cri-o://ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" gracePeriod=600
Apr 20 20:10:16.664460 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664425 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="thanos-sidecar" containerID="cri-o://0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" gracePeriod=600
Apr 20 20:10:16.664607 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664453 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-web" containerID="cri-o://73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" gracePeriod=600
Apr 20 20:10:16.664607 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664425 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy" containerID="cri-o://6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" gracePeriod=600
Apr 20 20:10:16.664607 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664472 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-thanos" containerID="cri-o://cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" gracePeriod=600
Apr 20 20:10:16.664756 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.664618 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="config-reloader" containerID="cri-o://d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" gracePeriod=600
Apr 20 20:10:16.909635 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:16.909609 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 20:10:17.079553 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079511 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079553 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079567 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079607 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m4k8\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079631 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079747 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079801 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.079834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079825 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079863 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079925 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079960 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.079997 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080024 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080055 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080082 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080089 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:17.080127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080108 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080574 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080150 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080574 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080574 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080217 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080574 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080267 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs\") pod \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\" (UID: \"43ef1974-2bb9-4c57-873d-71f4079ce1a4\") "
Apr 20 20:10:17.080796 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080775 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\""
Apr 20 20:10:17.080850 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.080804 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-metrics-client-ca\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\""
Apr 20 20:10:17.081124 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.081102 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:10:17.082996 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.082959 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.083138 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083004 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8" (OuterVolumeSpecName: "kube-api-access-4m4k8") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "kube-api-access-4m4k8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:10:17.083138 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.083138 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083079 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.083318 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083207 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.083375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:17.083453 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083424 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config" (OuterVolumeSpecName: "config") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.083571 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.083529 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.084236 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.084208 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:17.084584 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.084555 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 20:10:17.085497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.085472 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.085687 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.085668 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out" (OuterVolumeSpecName: "config-out") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 20:10:17.085933 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.085905 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 20:10:17.086027 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.085974 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 20:10:17.097194 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.097155 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config" (OuterVolumeSpecName: "web-config") pod "43ef1974-2bb9-4c57-873d-71f4079ce1a4" (UID: "43ef1974-2bb9-4c57-873d-71f4079ce1a4"). InnerVolumeSpecName "web-config".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 20:10:17.181802 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181770 2579 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config-out\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.181802 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181802 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181813 2579 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181825 2579 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-metrics-client-certs\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181836 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181844 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4m4k8\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-kube-api-access-4m4k8\") on node 
\"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181853 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181863 2579 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-web-config\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181893 2579 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-grpc-tls\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181906 2579 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43ef1974-2bb9-4c57-873d-71f4079ce1a4-tls-assets\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181917 2579 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-kube-rbac-proxy\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181931 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-k8s-db\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181944 2579 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-config\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181953 2579 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181963 2579 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/43ef1974-2bb9-4c57-873d-71f4079ce1a4-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.182004 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.181972 2579 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43ef1974-2bb9-4c57-873d-71f4079ce1a4-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:10:17.485779 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.485737 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod \"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:10:17.488372 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.488347 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5020f6ce-7061-484f-9b7d-89141a36e42c-metrics-certs\") pod 
\"network-metrics-daemon-g7wqd\" (UID: \"5020f6ce-7061-484f-9b7d-89141a36e42c\") " pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579459 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" exitCode=0 Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579487 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" exitCode=0 Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579494 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" exitCode=0 Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579499 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" exitCode=0 Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579504 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" exitCode=0 Apr 20 20:10:17.579498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579510 2579 generic.go:358] "Generic (PLEG): container finished" podID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" exitCode=0 Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579549 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579582 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579598 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579614 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579627 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579643 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579657 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579666 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.579840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.579671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"43ef1974-2bb9-4c57-873d-71f4079ce1a4","Type":"ContainerDied","Data":"776275f9eb046ec2cd919df1cf09ba8a7dbb13d8948b5b91a1817b508cce4834"} Apr 20 20:10:17.588433 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.588407 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.597025 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.597003 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.604239 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.604156 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:10:17.609184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.609144 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:10:17.609774 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.609750 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.617678 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.617650 2579 scope.go:117] "RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.625461 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.625436 2579 scope.go:117] "RemoveContainer" 
containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.634066 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634042 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:10:17.634146 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634102 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.634456 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634439 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="prometheus" Apr 20 20:10:17.634456 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634456 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="prometheus" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634466 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="thanos-sidecar" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634475 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="thanos-sidecar" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634486 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634495 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634514 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" 
containerName="kube-rbac-proxy-thanos" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634520 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-thanos" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634530 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-web" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634535 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-web" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634542 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="init-config-reloader" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634547 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="init-config-reloader" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634556 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="config-reloader" Apr 20 20:10:17.634556 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634561 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="config-reloader" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634610 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-thanos" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634617 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="prometheus" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634625 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy-web" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634632 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="config-reloader" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634641 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="kube-rbac-proxy" Apr 20 20:10:17.634990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.634651 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" containerName="thanos-sidecar" Apr 20 20:10:17.638517 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.638495 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4tpsm\"" Apr 20 20:10:17.640208 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.640185 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.642466 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 20:10:17.642617 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642569 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 20:10:17.642688 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642635 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-cgq4b\"" Apr 20 20:10:17.642688 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642669 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 20:10:17.642787 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642769 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 20:10:17.643001 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.642984 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ecf1ml8r5ghd\"" Apr 20 20:10:17.643068 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643019 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 20:10:17.643529 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643511 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 20:10:17.643692 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643658 2579 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 20:10:17.643786 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643692 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 20:10:17.643786 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643682 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.644055 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643661 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 20:10:17.644055 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.643695 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 20:10:17.644189 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.644164 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.644241 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644202 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID 
does not exist" Apr 20 20:10:17.644284 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644245 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.644555 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.644535 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.644643 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644565 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.644643 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644589 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.644908 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.644865 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 
20:10:17.644990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644914 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.644990 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.644933 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.645288 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.645264 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.645366 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645298 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.645366 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645322 2579 scope.go:117] 
"RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.645622 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.645599 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.645728 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645621 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.645728 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645636 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.645960 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.645935 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.646087 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645968 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.646087 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.645990 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.646211 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.646185 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 20:10:17.646256 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.646246 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-g7wqd" Apr 20 20:10:17.646742 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:10:17.646709 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.646818 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.646748 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.646818 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.646768 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.647052 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647019 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" Apr 20 
20:10:17.647052 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647039 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.647337 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647315 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.647433 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647339 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.647634 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647607 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.647634 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.647634 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.648311 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648286 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.648428 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648313 2579 scope.go:117] "RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.648611 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648588 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.648698 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648613 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.649025 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648944 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with 
ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.649025 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.648970 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.649320 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.649288 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.649430 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.649321 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.649483 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.649474 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 20:10:17.649644 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.649619 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" Apr 20 20:10:17.649733 ip-10-0-141-130 kubenswrapper[2579]: 
I0420 20:10:17.649648 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.650049 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650027 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.650145 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650050 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.650365 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650338 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.650365 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650364 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.650659 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650635 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.650753 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.650659 2579 scope.go:117] "RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.651498 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.651476 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.651596 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.651499 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.651837 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.651791 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with 
ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.651837 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.651814 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.652376 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652345 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.652462 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652378 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.652654 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652631 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" Apr 20 20:10:17.652705 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652657 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.652744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652726 2579 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:10:17.652969 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652948 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.653031 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.652971 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.653232 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653207 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.653290 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653236 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.653480 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653460 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container 
status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.653537 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653481 2579 scope.go:117] "RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.653699 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653680 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.653699 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653699 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.653935 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.653919 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.653935 ip-10-0-141-130 
kubenswrapper[2579]: I0420 20:10:17.653934 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.654120 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654106 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.654120 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654119 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.654306 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654291 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" Apr 20 20:10:17.654306 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654305 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.654529 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654504 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.654606 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654531 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.654781 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654751 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.654781 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.654781 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.655086 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655068 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 
0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.655144 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655087 2579 scope.go:117] "RemoveContainer" containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.655331 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655310 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.655421 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655331 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.655609 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655589 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.655668 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655609 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.655846 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655830 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.655926 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.655846 2579 scope.go:117] "RemoveContainer" containerID="cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a" Apr 20 20:10:17.656089 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656069 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a"} err="failed to get container status \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": rpc error: code = NotFound desc = could not find container \"cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a\": container with ID starting with cd1b3bc541a01bb4ca484e7785afbd2d81b7e2785552b499420facc44c2f9a4a not found: ID does not exist" Apr 20 20:10:17.656162 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656091 2579 scope.go:117] "RemoveContainer" containerID="6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8" Apr 20 20:10:17.656369 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656345 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8"} err="failed to get container status \"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": rpc error: code = NotFound desc = could not find container 
\"6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8\": container with ID starting with 6878bc6a7358a1a10f53ac46211290e196f6d2c837939194c4dcc2e26affd9f8 not found: ID does not exist" Apr 20 20:10:17.656369 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656367 2579 scope.go:117] "RemoveContainer" containerID="73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae" Apr 20 20:10:17.656636 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656613 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae"} err="failed to get container status \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": rpc error: code = NotFound desc = could not find container \"73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae\": container with ID starting with 73f527ce02f31110134a47d12442fea194983122bd3d0c11ec8e17b0e66c9eae not found: ID does not exist" Apr 20 20:10:17.656726 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656638 2579 scope.go:117] "RemoveContainer" containerID="0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc" Apr 20 20:10:17.657003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656868 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc"} err="failed to get container status \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": rpc error: code = NotFound desc = could not find container \"0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc\": container with ID starting with 0d281b2f9ea9b26fc7b938886b2413c1610590a3ac2eebddf3dba1b7c1a955cc not found: ID does not exist" Apr 20 20:10:17.657003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.656913 2579 scope.go:117] "RemoveContainer" 
containerID="d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986" Apr 20 20:10:17.657158 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.657141 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986"} err="failed to get container status \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": rpc error: code = NotFound desc = could not find container \"d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986\": container with ID starting with d3cd6ebfb0984d38f8a28b392bc0954cd0623ceddccd84a5e5e1af7bad8e5986 not found: ID does not exist" Apr 20 20:10:17.657198 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.657160 2579 scope.go:117] "RemoveContainer" containerID="ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd" Apr 20 20:10:17.657571 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.657555 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd"} err="failed to get container status \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": rpc error: code = NotFound desc = could not find container \"ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd\": container with ID starting with ddc2c7876a336f55d02b1b42645b5501bad66f8f63d3048095d8b0ce1e2b73bd not found: ID does not exist" Apr 20 20:10:17.657621 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.657572 2579 scope.go:117] "RemoveContainer" containerID="54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18" Apr 20 20:10:17.657811 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.657793 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18"} err="failed to get container status 
\"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": rpc error: code = NotFound desc = could not find container \"54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18\": container with ID starting with 54eb57a9fd1b0183bf08b3c2482e8b123ef7df7458d357f6e96eda6dad5cfc18 not found: ID does not exist" Apr 20 20:10:17.737540 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.737458 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43ef1974-2bb9-4c57-873d-71f4079ce1a4" path="/var/lib/kubelet/pods/43ef1974-2bb9-4c57-873d-71f4079ce1a4/volumes" Apr 20 20:10:17.781019 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.780948 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-g7wqd"] Apr 20 20:10:17.783721 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:10:17.783692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5020f6ce_7061_484f_9b7d_89141a36e42c.slice/crio-7f820eeb8e7f71efb61921e107f47f05c6b337e595fb31080d70310bbc75900b WatchSource:0}: Error finding container 7f820eeb8e7f71efb61921e107f47f05c6b337e595fb31080d70310bbc75900b: Status 404 returned error can't find the container with id 7f820eeb8e7f71efb61921e107f47f05c6b337e595fb31080d70310bbc75900b Apr 20 20:10:17.788235 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788349 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788244 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788349 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788349 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-web-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788443 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788482 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788472 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788528 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config-out\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788549 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788616 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgfq\" (UniqueName: \"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-kube-api-access-dfgfq\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788961 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788961 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788757 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788961 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.788961 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.788934 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889377 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889337 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889377 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889379 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889421 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-web-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889439 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889464 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889670 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889705 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config-out\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889734 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889759 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889783 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgfq\" (UniqueName: 
\"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-kube-api-access-dfgfq\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.889852 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889868 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.889972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890007 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890044 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890157 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890398 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890398 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890235 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.890500 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.890394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.891195 
ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.891166 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.892375 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.892344 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.892964 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.892919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.893721 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.893694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.894108 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.894077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.894204 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.894185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-web-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.894382 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.894361 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config-out\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.894592 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.894568 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.894673 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.894572 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.895085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.895056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") 
" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.895519 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.895498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-config\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.895900 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.895850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/791328e6-9d30-4dfc-9c1e-166eb176bc8b-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.895985 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.895853 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.897148 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.897124 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/791328e6-9d30-4dfc-9c1e-166eb176bc8b-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.901923 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.901849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgfq\" (UniqueName: \"kubernetes.io/projected/791328e6-9d30-4dfc-9c1e-166eb176bc8b-kube-api-access-dfgfq\") pod \"prometheus-k8s-0\" (UID: \"791328e6-9d30-4dfc-9c1e-166eb176bc8b\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:17.953792 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:17.953753 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:18.102981 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:18.102955 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 20:10:18.105087 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:10:18.105049 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791328e6_9d30_4dfc_9c1e_166eb176bc8b.slice/crio-678a0e93e66aa1f19f0980e8ca59eeb19f7fda8959074f5feef36236f990e7ad WatchSource:0}: Error finding container 678a0e93e66aa1f19f0980e8ca59eeb19f7fda8959074f5feef36236f990e7ad: Status 404 returned error can't find the container with id 678a0e93e66aa1f19f0980e8ca59eeb19f7fda8959074f5feef36236f990e7ad Apr 20 20:10:18.584437 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:18.584396 2579 generic.go:358] "Generic (PLEG): container finished" podID="791328e6-9d30-4dfc-9c1e-166eb176bc8b" containerID="d2c952b9d7213a93c27c0a88065c9c2ec1b7085058306df94c8cf4fcf79d1179" exitCode=0 Apr 20 20:10:18.584628 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:18.584483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerDied","Data":"d2c952b9d7213a93c27c0a88065c9c2ec1b7085058306df94c8cf4fcf79d1179"} Apr 20 20:10:18.584628 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:18.584520 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"678a0e93e66aa1f19f0980e8ca59eeb19f7fda8959074f5feef36236f990e7ad"} Apr 20 20:10:18.585861 ip-10-0-141-130 kubenswrapper[2579]: I0420 
20:10:18.585833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7wqd" event={"ID":"5020f6ce-7061-484f-9b7d-89141a36e42c","Type":"ContainerStarted","Data":"7f820eeb8e7f71efb61921e107f47f05c6b337e595fb31080d70310bbc75900b"} Apr 20 20:10:19.593959 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.593920 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"e88f9ddb09e283b74ac62f8fbcfc2537a278d3288b2485a36ce3088242f472bd"} Apr 20 20:10:19.594442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.593967 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"0b2ed1f1a4864e10f1035fdb3e456aa93db57cf0b6aae680d2350da0c664d9d5"} Apr 20 20:10:19.594442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.593981 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"3d2c61809ba6dcbc027a8793b8a151fa217d89719333674cbb297254ffa4866d"} Apr 20 20:10:19.594442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.593992 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"618dc1a072f07e6344e18b07456fc26f528d4adcaab69c032b211fe7fbd70d94"} Apr 20 20:10:19.594442 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.594004 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"02ecbb5907b26d23caf5e5ab8a5759f333822d9a2ce97222d8a61be4691e1bb0"} Apr 20 20:10:19.594442 ip-10-0-141-130 kubenswrapper[2579]: 
I0420 20:10:19.594015 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"791328e6-9d30-4dfc-9c1e-166eb176bc8b","Type":"ContainerStarted","Data":"555ffbeaa2f1822644cab48c8e290fc90a6428e1fa2ddfb4cb8cfac16bff5f21"} Apr 20 20:10:19.595654 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.595628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7wqd" event={"ID":"5020f6ce-7061-484f-9b7d-89141a36e42c","Type":"ContainerStarted","Data":"848d2dd6db1945aef10d0617b3dfee9fa78be2904a58170ed976e75b3e36e029"} Apr 20 20:10:19.595722 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.595659 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-g7wqd" event={"ID":"5020f6ce-7061-484f-9b7d-89141a36e42c","Type":"ContainerStarted","Data":"5a2a589506ce2c83b2ecd50b017b27a2ded76fbef040b16f492b35181db9b9f0"} Apr 20 20:10:19.619854 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.619801 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.619786375 podStartE2EDuration="2.619786375s" podCreationTimestamp="2026-04-20 20:10:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:10:19.618410558 +0000 UTC m=+254.465953600" watchObservedRunningTime="2026-04-20 20:10:19.619786375 +0000 UTC m=+254.467329416" Apr 20 20:10:19.633991 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:19.633925 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-g7wqd" podStartSLOduration=253.611482385 podStartE2EDuration="4m14.633909411s" podCreationTimestamp="2026-04-20 20:06:05 +0000 UTC" firstStartedPulling="2026-04-20 20:10:17.786036482 +0000 UTC m=+252.633579501" lastFinishedPulling="2026-04-20 
20:10:18.808463494 +0000 UTC m=+253.656006527" observedRunningTime="2026-04-20 20:10:19.632795531 +0000 UTC m=+254.480338569" watchObservedRunningTime="2026-04-20 20:10:19.633909411 +0000 UTC m=+254.481452451" Apr 20 20:10:22.954116 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:22.954067 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:10:39.371403 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.371358 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-s6tg6"] Apr 20 20:10:39.376937 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.376911 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.379781 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.379741 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 20:10:39.381416 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.381386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s6tg6"] Apr 20 20:10:39.493840 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.493803 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-dbus\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.494068 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.493905 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-kubelet-config\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " 
pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.494068 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.493927 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-original-pull-secret\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.595085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.595041 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-kubelet-config\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.595085 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.595083 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-original-pull-secret\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.595329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.595135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-dbus\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.595329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.595203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-kubelet-config\") 
pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.595329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.595269 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-dbus\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.597640 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.597609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/6e654a96-5dff-4bf5-bbaa-d9360ce9afb1-original-pull-secret\") pod \"global-pull-secret-syncer-s6tg6\" (UID: \"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1\") " pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.687629 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.687519 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-s6tg6" Apr 20 20:10:39.815160 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:39.815131 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-s6tg6"] Apr 20 20:10:39.818069 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:10:39.818037 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e654a96_5dff_4bf5_bbaa_d9360ce9afb1.slice/crio-76f818395857be6fad0106592bd29bbe9951f62dfd04f9228bd1ad9a18068169 WatchSource:0}: Error finding container 76f818395857be6fad0106592bd29bbe9951f62dfd04f9228bd1ad9a18068169: Status 404 returned error can't find the container with id 76f818395857be6fad0106592bd29bbe9951f62dfd04f9228bd1ad9a18068169 Apr 20 20:10:40.670033 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:40.669989 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s6tg6" event={"ID":"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1","Type":"ContainerStarted","Data":"76f818395857be6fad0106592bd29bbe9951f62dfd04f9228bd1ad9a18068169"} Apr 20 20:10:44.684189 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:44.684154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-s6tg6" event={"ID":"6e654a96-5dff-4bf5-bbaa-d9360ce9afb1","Type":"ContainerStarted","Data":"24ce4536c3735965045e0dd08901c98b87e9531d7a6f135d347606877faeac3f"} Apr 20 20:10:44.699299 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:10:44.699222 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-s6tg6" podStartSLOduration=1.341854804 podStartE2EDuration="5.699201647s" podCreationTimestamp="2026-04-20 20:10:39 +0000 UTC" firstStartedPulling="2026-04-20 20:10:39.819767149 +0000 UTC m=+274.667310168" lastFinishedPulling="2026-04-20 20:10:44.177113992 +0000 UTC m=+279.024657011" 
observedRunningTime="2026-04-20 20:10:44.698233568 +0000 UTC m=+279.545776609" watchObservedRunningTime="2026-04-20 20:10:44.699201647 +0000 UTC m=+279.546744688" Apr 20 20:11:05.598140 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:05.597519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:11:05.603505 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:05.603465 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:11:05.610348 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:05.610318 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:11:05.610995 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:05.610977 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:11:05.613516 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:05.613494 2579 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 20:11:17.954289 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:17.954238 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:11:17.971770 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:17.971740 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:11:18.813397 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:11:18.813370 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 20:14:00.089119 ip-10-0-141-130 kubenswrapper[2579]: 
I0420 20:14:00.089084 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-f8kzd"] Apr 20 20:14:00.091589 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.091569 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f8kzd" Apr 20 20:14:00.094025 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.093998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-h9zb4\"" Apr 20 20:14:00.094153 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.094046 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 20 20:14:00.094926 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.094908 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 20 20:14:00.095015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.094908 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 20 20:14:00.098821 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.098777 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f8kzd"] Apr 20 20:14:00.207286 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.207251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d25gc\" (UniqueName: \"kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc\") pod \"s3-init-f8kzd\" (UID: \"65c852ab-e23c-484a-8b33-81cc2f99b919\") " pod="kserve/s3-init-f8kzd" Apr 20 20:14:00.307743 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.307706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d25gc\" (UniqueName: \"kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc\") pod \"s3-init-f8kzd\" (UID: 
\"65c852ab-e23c-484a-8b33-81cc2f99b919\") " pod="kserve/s3-init-f8kzd" Apr 20 20:14:00.316767 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.316733 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d25gc\" (UniqueName: \"kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc\") pod \"s3-init-f8kzd\" (UID: \"65c852ab-e23c-484a-8b33-81cc2f99b919\") " pod="kserve/s3-init-f8kzd" Apr 20 20:14:00.416748 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.416654 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-f8kzd" Apr 20 20:14:00.542816 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.542789 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-f8kzd"] Apr 20 20:14:00.545537 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:14:00.545504 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65c852ab_e23c_484a_8b33_81cc2f99b919.slice/crio-8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58 WatchSource:0}: Error finding container 8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58: Status 404 returned error can't find the container with id 8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58 Apr 20 20:14:00.547519 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:00.547503 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:14:01.301693 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:01.301650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f8kzd" event={"ID":"65c852ab-e23c-484a-8b33-81cc2f99b919","Type":"ContainerStarted","Data":"8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58"} Apr 20 20:14:05.319057 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:05.319011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/s3-init-f8kzd" event={"ID":"65c852ab-e23c-484a-8b33-81cc2f99b919","Type":"ContainerStarted","Data":"61b39947665f660a607fce448104a1708adb90f217aff2a3ad77da343378df72"} Apr 20 20:14:05.333582 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:05.333515 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-f8kzd" podStartSLOduration=0.784649094 podStartE2EDuration="5.333496878s" podCreationTimestamp="2026-04-20 20:14:00 +0000 UTC" firstStartedPulling="2026-04-20 20:14:00.547626622 +0000 UTC m=+475.395169645" lastFinishedPulling="2026-04-20 20:14:05.096474403 +0000 UTC m=+479.944017429" observedRunningTime="2026-04-20 20:14:05.333205369 +0000 UTC m=+480.180748411" watchObservedRunningTime="2026-04-20 20:14:05.333496878 +0000 UTC m=+480.181039921" Apr 20 20:14:09.332277 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:09.332239 2579 generic.go:358] "Generic (PLEG): container finished" podID="65c852ab-e23c-484a-8b33-81cc2f99b919" containerID="61b39947665f660a607fce448104a1708adb90f217aff2a3ad77da343378df72" exitCode=0 Apr 20 20:14:09.332702 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:09.332289 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f8kzd" event={"ID":"65c852ab-e23c-484a-8b33-81cc2f99b919","Type":"ContainerDied","Data":"61b39947665f660a607fce448104a1708adb90f217aff2a3ad77da343378df72"} Apr 20 20:14:10.467927 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:10.467900 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f8kzd" Apr 20 20:14:10.499779 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:10.499744 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d25gc\" (UniqueName: \"kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc\") pod \"65c852ab-e23c-484a-8b33-81cc2f99b919\" (UID: \"65c852ab-e23c-484a-8b33-81cc2f99b919\") " Apr 20 20:14:10.502141 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:10.502108 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc" (OuterVolumeSpecName: "kube-api-access-d25gc") pod "65c852ab-e23c-484a-8b33-81cc2f99b919" (UID: "65c852ab-e23c-484a-8b33-81cc2f99b919"). InnerVolumeSpecName "kube-api-access-d25gc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:14:10.600636 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:10.600540 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d25gc\" (UniqueName: \"kubernetes.io/projected/65c852ab-e23c-484a-8b33-81cc2f99b919-kube-api-access-d25gc\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:14:11.340511 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:11.340472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-f8kzd" event={"ID":"65c852ab-e23c-484a-8b33-81cc2f99b919","Type":"ContainerDied","Data":"8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58"} Apr 20 20:14:11.340511 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:11.340505 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-f8kzd" Apr 20 20:14:11.340736 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:14:11.340513 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8140fb541d9f521c76499c403e83ea9e7fcefc6c55d485d48eb523c917bd7e58" Apr 20 20:16:05.628754 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:16:05.628727 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:16:05.629296 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:16:05.628896 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:16:05.639710 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:16:05.639683 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:16:05.640205 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:16:05.640184 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:21:05.658784 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:21:05.658711 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:21:05.660524 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:21:05.660494 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:21:05.666453 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:21:05.666427 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:21:05.668692 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:21:05.668672 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:26:05.685315 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:26:05.685281 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:26:05.687282 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:26:05.687260 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log" Apr 20 20:26:05.691844 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:26:05.691825 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:26:05.693629 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:26:05.693611 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log" Apr 20 20:28:02.626745 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.626709 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-l8sfp/must-gather-8c84x"] Apr 20 20:28:02.628092 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.627084 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="65c852ab-e23c-484a-8b33-81cc2f99b919" containerName="s3-init" Apr 20 20:28:02.628092 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.627097 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="65c852ab-e23c-484a-8b33-81cc2f99b919" containerName="s3-init" Apr 20 20:28:02.628092 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.627171 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="65c852ab-e23c-484a-8b33-81cc2f99b919" containerName="s3-init" Apr 20 20:28:02.629241 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.629221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.631695 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.631667 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l8sfp\"/\"openshift-service-ca.crt\"" Apr 20 20:28:02.631925 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.631673 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-l8sfp\"/\"kube-root-ca.crt\"" Apr 20 20:28:02.632571 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.632553 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-l8sfp\"/\"default-dockercfg-46sjq\"" Apr 20 20:28:02.639262 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.639234 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l8sfp/must-gather-8c84x"] Apr 20 20:28:02.739608 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.739552 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctxjv\" (UniqueName: \"kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.739608 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.739601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" 
(UniqueName: \"kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.840249 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.840207 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctxjv\" (UniqueName: \"kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.840439 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.840258 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.840646 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.840627 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.848096 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.848061 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctxjv\" (UniqueName: \"kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv\") pod \"must-gather-8c84x\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:02.947189 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:02.947090 2579 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:03.078745 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:03.078719 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-l8sfp/must-gather-8c84x"] Apr 20 20:28:03.081245 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:28:03.081214 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf790c38_e05e_44ab_8d93_d9a45a0df941.slice/crio-8d1899ba5c626f63240cd2d1dfc5f67af24e7e09d39a83cdec2132a082a0c933 WatchSource:0}: Error finding container 8d1899ba5c626f63240cd2d1dfc5f67af24e7e09d39a83cdec2132a082a0c933: Status 404 returned error can't find the container with id 8d1899ba5c626f63240cd2d1dfc5f67af24e7e09d39a83cdec2132a082a0c933 Apr 20 20:28:03.083304 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:03.083280 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 20:28:03.985448 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:03.985388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8sfp/must-gather-8c84x" event={"ID":"af790c38-e05e-44ab-8d93-d9a45a0df941","Type":"ContainerStarted","Data":"8d1899ba5c626f63240cd2d1dfc5f67af24e7e09d39a83cdec2132a082a0c933"} Apr 20 20:28:09.004063 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:09.004003 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8sfp/must-gather-8c84x" event={"ID":"af790c38-e05e-44ab-8d93-d9a45a0df941","Type":"ContainerStarted","Data":"0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892"} Apr 20 20:28:09.004063 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:09.004050 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8sfp/must-gather-8c84x" 
event={"ID":"af790c38-e05e-44ab-8d93-d9a45a0df941","Type":"ContainerStarted","Data":"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976"} Apr 20 20:28:09.021781 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:09.021719 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-l8sfp/must-gather-8c84x" podStartSLOduration=1.884765146 podStartE2EDuration="7.021701616s" podCreationTimestamp="2026-04-20 20:28:02 +0000 UTC" firstStartedPulling="2026-04-20 20:28:03.083437475 +0000 UTC m=+1317.930980497" lastFinishedPulling="2026-04-20 20:28:08.220373944 +0000 UTC m=+1323.067916967" observedRunningTime="2026-04-20 20:28:09.019128975 +0000 UTC m=+1323.866672016" watchObservedRunningTime="2026-04-20 20:28:09.021701616 +0000 UTC m=+1323.869244689" Apr 20 20:28:27.070745 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:27.070710 2579 generic.go:358] "Generic (PLEG): container finished" podID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerID="168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976" exitCode=0 Apr 20 20:28:27.071205 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:27.070791 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-l8sfp/must-gather-8c84x" event={"ID":"af790c38-e05e-44ab-8d93-d9a45a0df941","Type":"ContainerDied","Data":"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976"} Apr 20 20:28:27.071205 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:27.071182 2579 scope.go:117] "RemoveContainer" containerID="168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976" Apr 20 20:28:27.885002 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:27.884967 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8sfp_must-gather-8c84x_af790c38-e05e-44ab-8d93-d9a45a0df941/gather/0.log" Apr 20 20:28:31.142921 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:31.142890 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-s6tg6_6e654a96-5dff-4bf5-bbaa-d9360ce9afb1/global-pull-secret-syncer/0.log" Apr 20 20:28:31.303343 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:31.303302 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-t7s46_bb961195-80bd-4743-b8f8-8b5ed4db814c/konnectivity-agent/0.log" Apr 20 20:28:31.370597 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:31.370562 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-130.ec2.internal_d5ddc06d76dc03ab1fc5182830ef0d45/haproxy/0.log" Apr 20 20:28:33.263822 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.263782 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-l8sfp/must-gather-8c84x"] Apr 20 20:28:33.264297 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.264024 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-l8sfp/must-gather-8c84x" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="copy" containerID="cri-o://0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892" gracePeriod=2 Apr 20 20:28:33.266273 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.266234 2579 status_manager.go:895] "Failed to get status for pod" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" pod="openshift-must-gather-l8sfp/must-gather-8c84x" err="pods \"must-gather-8c84x\" is forbidden: User \"system:node:ip-10-0-141-130.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-l8sfp\": no relationship found between node 'ip-10-0-141-130.ec2.internal' and this object" Apr 20 20:28:33.267300 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.267274 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-l8sfp/must-gather-8c84x"] Apr 20 20:28:33.507247 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.507221 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8sfp_must-gather-8c84x_af790c38-e05e-44ab-8d93-d9a45a0df941/copy/0.log" Apr 20 20:28:33.507634 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.507617 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:33.509487 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.509460 2579 status_manager.go:895] "Failed to get status for pod" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" pod="openshift-must-gather-l8sfp/must-gather-8c84x" err="pods \"must-gather-8c84x\" is forbidden: User \"system:node:ip-10-0-141-130.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-l8sfp\": no relationship found between node 'ip-10-0-141-130.ec2.internal' and this object" Apr 20 20:28:33.622795 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.622703 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output\") pod \"af790c38-e05e-44ab-8d93-d9a45a0df941\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " Apr 20 20:28:33.622981 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.622821 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctxjv\" (UniqueName: \"kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv\") pod \"af790c38-e05e-44ab-8d93-d9a45a0df941\" (UID: \"af790c38-e05e-44ab-8d93-d9a45a0df941\") " Apr 20 20:28:33.624662 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.624624 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "af790c38-e05e-44ab-8d93-d9a45a0df941" (UID: 
"af790c38-e05e-44ab-8d93-d9a45a0df941"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 20:28:33.625314 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.625279 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv" (OuterVolumeSpecName: "kube-api-access-ctxjv") pod "af790c38-e05e-44ab-8d93-d9a45a0df941" (UID: "af790c38-e05e-44ab-8d93-d9a45a0df941"). InnerVolumeSpecName "kube-api-access-ctxjv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 20:28:33.723741 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.723701 2579 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/af790c38-e05e-44ab-8d93-d9a45a0df941-must-gather-output\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:28:33.723741 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.723733 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctxjv\" (UniqueName: \"kubernetes.io/projected/af790c38-e05e-44ab-8d93-d9a45a0df941-kube-api-access-ctxjv\") on node \"ip-10-0-141-130.ec2.internal\" DevicePath \"\"" Apr 20 20:28:33.736724 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:33.736685 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" path="/var/lib/kubelet/pods/af790c38-e05e-44ab-8d93-d9a45a0df941/volumes" Apr 20 20:28:34.094954 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.094926 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-l8sfp_must-gather-8c84x_af790c38-e05e-44ab-8d93-d9a45a0df941/copy/0.log" Apr 20 20:28:34.095264 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.095238 2579 generic.go:358] "Generic (PLEG): container finished" podID="af790c38-e05e-44ab-8d93-d9a45a0df941" 
containerID="0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892" exitCode=143 Apr 20 20:28:34.095322 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.095293 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-l8sfp/must-gather-8c84x" Apr 20 20:28:34.095365 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.095348 2579 scope.go:117] "RemoveContainer" containerID="0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892" Apr 20 20:28:34.103573 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.103552 2579 scope.go:117] "RemoveContainer" containerID="168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976" Apr 20 20:28:34.117555 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.117529 2579 scope.go:117] "RemoveContainer" containerID="0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892" Apr 20 20:28:34.117928 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:28:34.117868 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892\": container with ID starting with 0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892 not found: ID does not exist" containerID="0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892" Apr 20 20:28:34.118022 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.117946 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892"} err="failed to get container status \"0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892\": rpc error: code = NotFound desc = could not find container \"0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892\": container with ID starting with 0f1353842446e74fac735d56676675acc0254a7247d832a5f9031a453f17d892 not found: ID does not exist" 
Apr 20 20:28:34.118022 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.117977 2579 scope.go:117] "RemoveContainer" containerID="168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976" Apr 20 20:28:34.118265 ip-10-0-141-130 kubenswrapper[2579]: E0420 20:28:34.118247 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976\": container with ID starting with 168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976 not found: ID does not exist" containerID="168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976" Apr 20 20:28:34.118321 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.118274 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976"} err="failed to get container status \"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976\": rpc error: code = NotFound desc = could not find container \"168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976\": container with ID starting with 168d7a4166b0d8e331466e1ddd39bb4045fc186e00ea0cdbd9a91e19c7f0d976 not found: ID does not exist" Apr 20 20:28:34.703501 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.703463 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-pfg5q_c79c4964-96fa-4fde-9a9d-7928a8841145/cluster-monitoring-operator/0.log" Apr 20 20:28:34.827457 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.827429 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-57958c75ff-6lnmd_4e5316e3-2107-49b1-b744-bd92a811e921/metrics-server/0.log" Apr 20 20:28:34.857683 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:34.857655 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-n95gd_4031d84e-286b-453d-b9bf-80024787b990/monitoring-plugin/0.log"
Apr 20 20:28:35.069107 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.069067 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m5wpz_e7242252-d37d-4f22-8f26-227293d15aed/node-exporter/0.log"
Apr 20 20:28:35.090775 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.090728 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m5wpz_e7242252-d37d-4f22-8f26-227293d15aed/kube-rbac-proxy/0.log"
Apr 20 20:28:35.114327 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.114301 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-m5wpz_e7242252-d37d-4f22-8f26-227293d15aed/init-textfile/0.log"
Apr 20 20:28:35.234491 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.234461 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/prometheus/0.log"
Apr 20 20:28:35.255629 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.255584 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/config-reloader/0.log"
Apr 20 20:28:35.280950 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.280915 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/thanos-sidecar/0.log"
Apr 20 20:28:35.304687 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.304583 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/kube-rbac-proxy-web/0.log"
Apr 20 20:28:35.330490 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.330443 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/kube-rbac-proxy/0.log"
Apr 20 20:28:35.352906 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.352860 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/kube-rbac-proxy-thanos/0.log"
Apr 20 20:28:35.377744 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.377718 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_791328e6-9d30-4dfc-9c1e-166eb176bc8b/init-config-reloader/0.log"
Apr 20 20:28:35.462943 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.462913 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-qbjkd_f139f953-09ab-4649-bfd3-df6122181f1a/prometheus-operator-admission-webhook/0.log"
Apr 20 20:28:35.493834 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.493779 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66d56bfd8-82vtl_1163c73f-65e4-431f-926c-dbbdb381939d/telemeter-client/0.log"
Apr 20 20:28:35.516807 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.516769 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66d56bfd8-82vtl_1163c73f-65e4-431f-926c-dbbdb381939d/reload/0.log"
Apr 20 20:28:35.540378 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.540344 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-66d56bfd8-82vtl_1163c73f-65e4-431f-926c-dbbdb381939d/kube-rbac-proxy/0.log"
Apr 20 20:28:35.573497 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.573427 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/thanos-query/0.log"
Apr 20 20:28:35.598689 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.598658 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/kube-rbac-proxy-web/0.log"
Apr 20 20:28:35.622164 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.622116 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/kube-rbac-proxy/0.log"
Apr 20 20:28:35.645730 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.645701 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/prom-label-proxy/0.log"
Apr 20 20:28:35.669312 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.669278 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/kube-rbac-proxy-rules/0.log"
Apr 20 20:28:35.692732 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:35.692699 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-668f954756-5c6pq_df8856cf-ad0c-428d-936d-9ac93ef7d700/kube-rbac-proxy-metrics/0.log"
Apr 20 20:28:36.852125 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:36.852095 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-wwct6_157e2b24-16fe-402b-ad88-6a65a8c662f5/networking-console-plugin/0.log"
Apr 20 20:28:37.313010 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:37.312972 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/1.log"
Apr 20 20:28:37.317418 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:37.317398 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-69djd_fc299172-ad2d-4bef-a2b2-de75054e20b5/console-operator/2.log"
Apr 20 20:28:37.769527 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:37.769498 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-2fvv4_c52fae83-c67e-4e0e-a29f-7df47f11231f/download-server/0.log"
Apr 20 20:28:38.190376 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.190294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-sxmj9_43faf1e3-dfc2-49bf-807e-cef46ba72873/volume-data-source-validator/0.log"
Apr 20 20:28:38.860329 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.860297 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-54k7f_a614e742-616a-48e4-bb64-1c023ed6fecf/dns/0.log"
Apr 20 20:28:38.880717 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.880687 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-54k7f_a614e742-616a-48e4-bb64-1c023ed6fecf/kube-rbac-proxy/0.log"
Apr 20 20:28:38.939511 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939474 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"]
Apr 20 20:28:38.939837 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939824 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="copy"
Apr 20 20:28:38.939894 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939839 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="copy"
Apr 20 20:28:38.939894 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939854 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="gather"
Apr 20 20:28:38.939894 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939860 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="gather"
Apr 20 20:28:38.939988 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939935 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="gather"
Apr 20 20:28:38.939988 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.939943 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="af790c38-e05e-44ab-8d93-d9a45a0df941" containerName="copy"
Apr 20 20:28:38.945300 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.945277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:38.954374 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.948581 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"kube-root-ca.crt\""
Apr 20 20:28:38.954374 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.950067 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxgfw\"/\"openshift-service-ca.crt\""
Apr 20 20:28:38.954374 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.950362 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxgfw\"/\"default-dockercfg-f8tg6\""
Apr 20 20:28:38.954374 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:38.953289 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"]
Apr 20 20:28:39.065600 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.065559 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-n52n7_f4f29d83-69da-4bc7-a3ce-9bcc03a224ed/dns-node-resolver/0.log"
Apr 20 20:28:39.073312 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.073266 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-proc\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.073454 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.073337 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-podres\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.073454 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.073407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-sys\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.073454 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.073449 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-lib-modules\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.073644 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.073500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkd2r\" (UniqueName: \"kubernetes.io/projected/0a803e74-4314-4380-91fe-e49fd26b9d32-kube-api-access-kkd2r\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.174937 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174823 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-proc\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.174937 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-podres\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.174937 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174927 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-sys\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174944 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-lib-modules\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174958 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-proc\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.174976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kkd2r\" (UniqueName: \"kubernetes.io/projected/0a803e74-4314-4380-91fe-e49fd26b9d32-kube-api-access-kkd2r\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.175042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-sys\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.175062 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-lib-modules\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.175184 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.175076 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0a803e74-4314-4380-91fe-e49fd26b9d32-podres\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.182713 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.182674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkd2r\" (UniqueName: \"kubernetes.io/projected/0a803e74-4314-4380-91fe-e49fd26b9d32-kube-api-access-kkd2r\") pod \"perf-node-gather-daemonset-2nbcm\" (UID: \"0a803e74-4314-4380-91fe-e49fd26b9d32\") " pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.263127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.263092 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:39.391323 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.391293 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"]
Apr 20 20:28:39.393726 ip-10-0-141-130 kubenswrapper[2579]: W0420 20:28:39.393694 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0a803e74_4314_4380_91fe_e49fd26b9d32.slice/crio-a422b43edb0566d1cb4eaeff3b0769aaec454c430aca757f4dc1ef2714fba362 WatchSource:0}: Error finding container a422b43edb0566d1cb4eaeff3b0769aaec454c430aca757f4dc1ef2714fba362: Status 404 returned error can't find the container with id a422b43edb0566d1cb4eaeff3b0769aaec454c430aca757f4dc1ef2714fba362
Apr 20 20:28:39.473003 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.472978 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6598fdf965-7nq6v_609404ae-d83c-4bce-97c6-0c193b6b0485/registry/0.log"
Apr 20 20:28:39.491803 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:39.491777 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7t5kf_a545fd66-1350-4958-93ed-a017391ba8d7/node-ca/0.log"
Apr 20 20:28:40.115721 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.115683 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm" event={"ID":"0a803e74-4314-4380-91fe-e49fd26b9d32","Type":"ContainerStarted","Data":"ced14f913b66973d49d0a48082895f4970e789ff27f618be7c1e810976bcde8d"}
Apr 20 20:28:40.115721 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.115726 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm" event={"ID":"0a803e74-4314-4380-91fe-e49fd26b9d32","Type":"ContainerStarted","Data":"a422b43edb0566d1cb4eaeff3b0769aaec454c430aca757f4dc1ef2714fba362"}
Apr 20 20:28:40.116014 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.115773 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:40.133318 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.133258 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm" podStartSLOduration=2.133243031 podStartE2EDuration="2.133243031s" podCreationTimestamp="2026-04-20 20:28:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 20:28:40.131171316 +0000 UTC m=+1354.978714356" watchObservedRunningTime="2026-04-20 20:28:40.133243031 +0000 UTC m=+1354.980786137"
Apr 20 20:28:40.581421 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.581390 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-zq94z_ac55a3c1-11b3-4209-b73c-91b790e26c17/serve-healthcheck-canary/0.log"
Apr 20 20:28:40.997127 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:40.997076 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gdfgp_babd65b4-ae72-4241-ac04-d1773d2684e6/kube-rbac-proxy/0.log"
Apr 20 20:28:41.016577 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:41.016552 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gdfgp_babd65b4-ae72-4241-ac04-d1773d2684e6/exporter/0.log"
Apr 20 20:28:41.037015 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:41.036988 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gdfgp_babd65b4-ae72-4241-ac04-d1773d2684e6/extractor/0.log"
Apr 20 20:28:43.206056 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:43.206027 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-f8kzd_65c852ab-e23c-484a-8b33-81cc2f99b919/s3-init/0.log"
Apr 20 20:28:46.129804 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:46.129765 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxgfw/perf-node-gather-daemonset-2nbcm"
Apr 20 20:28:47.110238 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:47.110203 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-l42zw_b41883eb-364a-40b1-af2a-729a431739b1/kube-storage-version-migrator-operator/1.log"
Apr 20 20:28:47.111026 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:47.111006 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-l42zw_b41883eb-364a-40b1-af2a-729a431739b1/kube-storage-version-migrator-operator/0.log"
Apr 20 20:28:48.173094 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.173007 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7mjbn_7307693c-15c5-4f2a-ac49-7a1626eb5a0d/kube-multus/0.log"
Apr 20 20:28:48.524088 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.524058 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/kube-multus-additional-cni-plugins/0.log"
Apr 20 20:28:48.548088 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.548061 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/egress-router-binary-copy/0.log"
Apr 20 20:28:48.572393 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.572356 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/cni-plugins/0.log"
Apr 20 20:28:48.596407 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.596384 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/bond-cni-plugin/0.log"
Apr 20 20:28:48.620319 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.620291 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/routeoverride-cni/0.log"
Apr 20 20:28:48.639674 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.639646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/whereabouts-cni-bincopy/0.log"
Apr 20 20:28:48.661577 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.661545 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-vrb5p_9774d5fe-006d-4f52-a1aa-6ab55dcf9946/whereabouts-cni/0.log"
Apr 20 20:28:48.785371 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.785294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7wqd_5020f6ce-7061-484f-9b7d-89141a36e42c/network-metrics-daemon/0.log"
Apr 20 20:28:48.807698 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:48.807667 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-g7wqd_5020f6ce-7061-484f-9b7d-89141a36e42c/kube-rbac-proxy/0.log"
Apr 20 20:28:49.493244 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.493215 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-controller/0.log"
Apr 20 20:28:49.512543 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.512514 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/0.log"
Apr 20 20:28:49.518775 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.518752 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovn-acl-logging/1.log"
Apr 20 20:28:49.538024 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.537994 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/kube-rbac-proxy-node/0.log"
Apr 20 20:28:49.560093 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.560064 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 20:28:49.580522 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.580495 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/northd/0.log"
Apr 20 20:28:49.601567 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.601533 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/nbdb/0.log"
Apr 20 20:28:49.622554 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.622529 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/sbdb/0.log"
Apr 20 20:28:49.716543 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:49.716507 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-48d2t_1e81f4c9-5ef3-4e1c-b95a-2cb9fe8e16ac/ovnkube-controller/0.log"
Apr 20 20:28:51.259802 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:51.259767 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-6m4mg_7f4a42ba-8098-4e0b-b398-f97f5e0c71d3/check-endpoints/0.log"
Apr 20 20:28:51.307858 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:51.307826 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ncwtr_f462fd2a-e95e-4bb6-b40a-76ad1cf91759/network-check-target-container/0.log"
Apr 20 20:28:52.206842 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:52.206808 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-nqld6_0cc0ab62-60fe-454e-bf85-90ca6410b3d1/iptables-alerter/0.log"
Apr 20 20:28:52.905522 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:52.905490 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-qktfm_f38465e3-0b1f-43c3-9a2b-3f294c21b82a/tuned/0.log"
Apr 20 20:28:54.625486 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:54.625450 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-n4p5j_17c337e8-f179-4bd2-b623-1741863e9278/cluster-samples-operator/0.log"
Apr 20 20:28:54.641621 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:54.641599 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-n4p5j_17c337e8-f179-4bd2-b623-1741863e9278/cluster-samples-operator-watch/0.log"
Apr 20 20:28:55.569135 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:55.569096 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-28z6v_09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd/service-ca-operator/1.log"
Apr 20 20:28:55.569906 ip-10-0-141-130 kubenswrapper[2579]: I0420 20:28:55.569869 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-28z6v_09d9eaa7-0f7e-4ce2-9009-ccea0b0f2ddd/service-ca-operator/0.log"