Apr 16 18:27:43.480413 ip-10-0-139-49 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 18:27:43.480425 ip-10-0-139-49 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 18:27:43.480434 ip-10-0-139-49 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 18:27:43.480730 ip-10-0-139-49 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 18:27:53.599053 ip-10-0-139-49 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 18:27:53.599073 ip-10-0-139-49 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot fe36193b218d4bc89bb0869554d11cc7 --
Apr 16 18:30:19.498172 ip-10-0-139-49 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:30:19.946487 ip-10-0-139-49 kubenswrapper[2577]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:19.946487 ip-10-0-139-49 kubenswrapper[2577]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:30:19.946487 ip-10-0-139-49 kubenswrapper[2577]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:19.946487 ip-10-0-139-49 kubenswrapper[2577]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:30:19.946487 ip-10-0-139-49 kubenswrapper[2577]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:30:19.948089 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.947995    2577 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:30:19.953685 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953665    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:19.953685 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953681    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:19.953685 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953687    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:19.953685 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953692    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953696    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953702    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953706    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953710    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953714    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953719    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953723    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953727    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953730    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953734    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953738    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953742    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953746    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953750    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953754    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953759    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953762    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953766    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:19.953952 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953773    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953778    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953783    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953787    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953791    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953795    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953800    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953804    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953809    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953813    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953817    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953822    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953826    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953841    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953845    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953851    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953871    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953876    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953880    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953885    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:19.954716 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953889    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953896    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953900    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953904    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953909    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953913    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953917    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953922    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953927    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953931    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953936    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953940    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953944    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953949    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953953    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953956    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953960    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953964    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953968    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953975    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:19.955488 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953982    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953986    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953991    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.953996    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954000    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954004    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954010    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954015    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954019    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954024    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954028    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954033    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954037    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954042    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954046    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954052    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954056    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954060    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954064    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:19.956029 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954069    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954073    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954077    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954082    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954086    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954722    2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954732    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954736    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954740    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954744    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954749    2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954753    2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954758    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954762    2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954766    2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954771    2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954775    2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954779    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954785    2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954794    2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:19.956816 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954802    2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954810    2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954814    2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954819    2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954823    2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954827    2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954831    2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954837    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954841    2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954845    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954849    2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954854    2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954874    2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954879    2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954886    2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954892    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954897    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954902    2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954907    2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:19.957374 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954911    2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954916    2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954921    2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954926    2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954933    2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954937    2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954941    2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954945    2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954950    2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954954    2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954958    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954963    2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954967    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954973    2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954978    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954982    2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954986    2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954990    2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.954994    2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:19.957899 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955001    2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955005    2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955009    2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955013    2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955018    2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955022    2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955026    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955030    2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955035    2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955039    2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955043    2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955047    2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955051    2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955056    2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955060    2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955064    2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955069    2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955073    2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955078    2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955082    2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:19.958373 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955086    2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955090    2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955095    2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955099    2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955103    2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955107    2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955112    2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955117    2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955122    2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955126    2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955130    2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955134    2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.955138    2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.955981    2577 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.955997    2577 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956008    2577 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956015    2577 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956022    2577 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956026    2577 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956034    2577 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956041    2577 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:30:19.958898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956046    2577 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956051    2577 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956057    2577 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956062    2577 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956067    2577 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956072    2577 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956077    2577 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956082    2577 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956087    2577 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956092    2577 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956097    2577 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956107    2577 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956112    2577 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956118    2577 flags.go:64] FLAG: --config-dir=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956122    2577 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956128    2577 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956135    2577 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956140    2577 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956146    2577 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956153    2577 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956158    2577 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956163    2577 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956168    2577 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956173    2577 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956178    2577 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:30:19.959772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956185    2577 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956190    2577 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956196    2577 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956201    2577 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956206    2577 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956211    2577 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956217    2577 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956223    2577 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956228    2577 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956234    2577 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956239    2577 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956245    2577 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956250    2577 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956255    2577 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956260    2577 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956265    2577 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956270    2577 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956275    2577 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956280    2577 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956285    2577 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956290    2577 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956294    2577 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956301    2577 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956306    2577 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956312    2577 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:30:19.960607 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956318    2577 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956322    2577 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956329    2577
flags.go:64] FLAG: --help="false" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956334 2577 flags.go:64] FLAG: --hostname-override="ip-10-0-139-49.ec2.internal" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956339 2577 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956345 2577 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956350 2577 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956355 2577 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956361 2577 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956366 2577 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956371 2577 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956376 2577 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956380 2577 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956385 2577 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956392 2577 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956397 2577 flags.go:64] FLAG: --kube-reserved="" Apr 16 18:30:19.961247 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:19.956402 2577 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956407 2577 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956412 2577 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956416 2577 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956422 2577 flags.go:64] FLAG: --lock-file="" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956427 2577 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956432 2577 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956437 2577 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 18:30:19.961247 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956445 2577 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956450 2577 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956455 2577 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956460 2577 flags.go:64] FLAG: --logging-format="text" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956465 2577 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956471 2577 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956476 2577 flags.go:64] FLAG: 
--manifest-url="" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956481 2577 flags.go:64] FLAG: --manifest-url-header="" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956488 2577 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956493 2577 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956500 2577 flags.go:64] FLAG: --max-pods="110" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956506 2577 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956512 2577 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956517 2577 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956522 2577 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956527 2577 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956532 2577 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956536 2577 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956549 2577 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956554 2577 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956559 2577 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 18:30:19.961824 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:19.956564 2577 flags.go:64] FLAG: --pod-cidr="" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956569 2577 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 18:30:19.961824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956576 2577 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956581 2577 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956587 2577 flags.go:64] FLAG: --pods-per-core="0" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956592 2577 flags.go:64] FLAG: --port="10250" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956597 2577 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956602 2577 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0d3a05a033f17ded7" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956607 2577 flags.go:64] FLAG: --qos-reserved="" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956615 2577 flags.go:64] FLAG: --read-only-port="10255" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956620 2577 flags.go:64] FLAG: --register-node="true" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956625 2577 flags.go:64] FLAG: --register-schedulable="true" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956630 2577 flags.go:64] FLAG: --register-with-taints="" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956637 2577 flags.go:64] FLAG: --registry-burst="10" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956641 2577 
flags.go:64] FLAG: --registry-qps="5" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956646 2577 flags.go:64] FLAG: --reserved-cpus="" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956651 2577 flags.go:64] FLAG: --reserved-memory="" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956657 2577 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956662 2577 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956667 2577 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956672 2577 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956678 2577 flags.go:64] FLAG: --runonce="false" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956682 2577 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956688 2577 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956693 2577 flags.go:64] FLAG: --seccomp-default="false" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956698 2577 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956703 2577 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956708 2577 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 18:30:19.962418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956713 2577 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 18:30:19.963055 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:19.956718 2577 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956723 2577 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956728 2577 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956733 2577 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956737 2577 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956743 2577 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956748 2577 flags.go:64] FLAG: --system-cgroups="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956753 2577 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956762 2577 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956767 2577 flags.go:64] FLAG: --tls-cert-file="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956771 2577 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956779 2577 flags.go:64] FLAG: --tls-min-version="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956785 2577 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956790 2577 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956795 2577 flags.go:64] FLAG: 
--topology-manager-policy-options="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956800 2577 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956805 2577 flags.go:64] FLAG: --v="2" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956811 2577 flags.go:64] FLAG: --version="false" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956818 2577 flags.go:64] FLAG: --vmodule="" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956824 2577 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.956830 2577 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957026 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957043 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 18:30:19.963055 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957046 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957050 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957053 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957056 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957059 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 18:30:19.963638 
ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957062 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957064 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957068 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957071 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957075 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957080 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957083 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957086 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957089 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957091 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957094 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957097 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957100 2577 feature_gate.go:328] 
unrecognized feature gate: NoRegistryClusterOperations Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957103 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 18:30:19.963638 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957106 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957109 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957118 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957121 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957123 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957126 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957129 2577 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957131 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957134 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957136 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957139 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 18:30:19.964129 
ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957142 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957145 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957148 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957151 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957154 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957156 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957159 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957162 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 18:30:19.964129 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957165 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957167 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957170 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957173 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957175 2577 feature_gate.go:328] unrecognized feature gate: 
MultiArchInstallAzure Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957178 2577 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957181 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957183 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957186 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957189 2577 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957192 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957195 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957197 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957201 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957203 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957206 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957209 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957211 2577 feature_gate.go:328] 
unrecognized feature gate: KMSEncryptionProvider Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957214 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957217 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 18:30:19.964599 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957219 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957222 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957224 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957228 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957231 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957234 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957237 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957240 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957242 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957245 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957248 2577 
feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957251 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957253 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957256 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957258 2577 feature_gate.go:328] unrecognized feature gate: Example Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957261 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957264 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957266 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957269 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957271 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 18:30:19.965115 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957274 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 18:30:19.965605 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957276 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 18:30:19.965605 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957279 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 18:30:19.965605 ip-10-0-139-49 
kubenswrapper[2577]: W0416 18:30:19.957282 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:19.965605 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957286 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:19.965605 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.957290 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:19.965605 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.958010 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:19.968652 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.968629 2577 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:30:19.968694 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.968653 2577 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968702 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968707 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968710 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968714 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968716 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968719 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968722 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968725 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968728 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:19.968726 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968731 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968734 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968737 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968740 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968742 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968745 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968748 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968750 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968753 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968755 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968758 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968761 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968763 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968766 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968769 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968772 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968775 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968777 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968780 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:19.968989 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968782 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968785 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968788 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968790 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968793 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968795 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968799 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968801 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968804 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968806 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968809 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968811 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968814 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968817 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968820 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968823 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968826 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968829 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968831 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968834 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:19.969461 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968837 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968840 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968843 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968846 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968849 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968851 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968854 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968871 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968874 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968877 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968879 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968882 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968886 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968889 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968892 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968894 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968897 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968900 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968903 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968905 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:19.969967 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968908 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968911 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968914 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968919 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968923 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968926 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968929 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968933 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968937 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968940 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968943 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968946 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968948 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968951 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968954 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968957 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968959 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:19.970501 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.968962 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.968967 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969061 2577 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969066 2577 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969069 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969072 2577 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969075 2577 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969078 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969081 2577 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969084 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969087 2577 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969089 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969092 2577 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969096 2577 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969099 2577 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969101 2577 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:30:19.970936 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969104 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969107 2577 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969109 2577 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969112 2577 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969116 2577 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969119 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969122 2577 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969125 2577 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969128 2577 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969131 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969133 2577 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969136 2577 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969139 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969142 2577 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969145 2577 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969147 2577 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969150 2577 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969153 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969156 2577 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:30:19.971331 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969158 2577 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969161 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969163 2577 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969166 2577 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969168 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969171 2577 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969174 2577 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969176 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969179 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969182 2577 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969185 2577 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969188 2577 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969190 2577 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969193 2577 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969196 2577 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969198 2577 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969201 2577 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969203 2577 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969206 2577 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969209 2577 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:30:19.972075 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969212 2577 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969214 2577 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969218 2577 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969220 2577 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969223 2577 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969225 2577 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969228 2577 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969232 2577 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969235 2577 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969238 2577 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969241 2577 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969244 2577 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969246 2577 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969249 2577 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969252 2577 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969272 2577 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969276 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969279 2577 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969283 2577 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969286 2577 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:30:19.972565 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969289 2577 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969292 2577 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969295 2577 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969297 2577 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969300 2577 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969303 2577 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969306 2577 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969308 2577 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969311 2577 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969313 2577 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969316 2577 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969318 2577 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:19.969321 2577 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.969325 2577 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:30:19.973071 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.970287 2577 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:30:19.974034 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.974019 2577 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:30:19.975022 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.975011 2577 server.go:1019] "Starting client certificate rotation"
Apr 16 18:30:19.975130 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.975113 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:19.975297 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:19.975287 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:30:20.000615 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.000592 2577 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:20.005363 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.005322 2577 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:30:20.025126 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.025099 2577 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:30:20.030918 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.030898 2577 log.go:25] "Validated CRI v1 image API"
Apr 16 18:30:20.032295 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.032275 2577 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:30:20.033120 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.033105 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:20.035969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.035946 2577 fs.go:135] Filesystem UUIDs: map[44481879-4087-4e9c-aa9a-48ff035c71e0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 9254e25f-1f46-4c1b-a4f3-1dd3d9d643c0:/dev/nvme0n1p4]
Apr 16 18:30:20.036025 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.035970 2577 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:30:20.041590 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.041344 2577 manager.go:217] Machine: {Timestamp:2026-04-16 18:30:20.040099911 +0000 UTC m=+0.424420223 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3091712 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2778ea7e96a1736dc6e2709dfe5330 SystemUUID:ec2778ea-7e96-a173-6dc6-e2709dfe5330 BootID:fe36193b-218d-4bc8-9bb0-869554d11cc7 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8a:bf:ef:62:05 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8a:bf:ef:62:05 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:c1:68:b2:2f:2e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:30:20.041590 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.041587 2577 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 18:30:20.041698 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.041672 2577 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:30:20.043349 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043321 2577 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:30:20.043531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043352 2577 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-49.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:30:20.043580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043543 2577 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:30:20.043580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043552 2577 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:30:20.043580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043564 2577 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:20.043661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.043581 2577 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:30:20.044991 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.044980 2577 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:30:20.045099 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.045090 2577 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:30:20.047458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.047448 2577 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:30:20.047496 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.047466 2577 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:30:20.047496 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.047479 2577 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:30:20.047496 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.047488 2577 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:30:20.047591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.047506 2577 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:30:20.048593 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.048581 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:20.048634 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.048603 2577 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:30:20.049334 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.049310 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s4m25"
Apr 16 18:30:20.051824 ip-10-0-139-49
kubenswrapper[2577]: I0416 18:30:20.051798 2577 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 18:30:20.053605 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.053592 2577 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 18:30:20.055008 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.054997 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055013 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055020 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055026 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055032 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055038 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 18:30:20.055049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055044 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 18:30:20.055213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055052 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 18:30:20.055213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055064 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 18:30:20.055213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055071 2577 plugins.go:616] "Loaded volume 
plugin" pluginName="kubernetes.io/configmap" Apr 16 18:30:20.055213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055086 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 18:30:20.055213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055094 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 18:30:20.055883 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055871 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 18:30:20.055883 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.055883 2577 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 18:30:20.057181 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.057164 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-s4m25" Apr 16 18:30:20.058108 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.058085 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-49.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 18:30:20.058173 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.058093 2577 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 18:30:20.059732 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.059719 2577 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 18:30:20.059776 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.059757 2577 server.go:1295] "Started kubelet" Apr 16 18:30:20.059904 ip-10-0-139-49 kubenswrapper[2577]: I0416 
18:30:20.059877 2577 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 18:30:20.059967 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.059884 2577 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 18:30:20.059967 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.059957 2577 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 18:30:20.060672 ip-10-0-139-49 systemd[1]: Started Kubernetes Kubelet. Apr 16 18:30:20.061106 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.061088 2577 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 18:30:20.065295 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.065273 2577 server.go:317] "Adding debug handlers to kubelet server" Apr 16 18:30:20.070399 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.070379 2577 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 18:30:20.070986 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.070969 2577 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 18:30:20.071701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.071666 2577 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 18:30:20.071783 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.071759 2577 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 18:30:20.071783 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.071764 2577 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 18:30:20.071945 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.071786 2577 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 18:30:20.071945 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.071848 2577 reconstruct.go:97] "Volume reconstruction finished" Apr 16 18:30:20.071945 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.071885 2577 reconciler.go:26] "Reconciler: start to sync state" Apr 16 18:30:20.072045 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.071940 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.072245 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072230 2577 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 18:30:20.072280 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072247 2577 factory.go:55] Registering systemd factory Apr 16 18:30:20.072280 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072254 2577 factory.go:223] Registration of the systemd container factory successfully Apr 16 18:30:20.072452 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072436 2577 factory.go:153] Registering CRI-O factory Apr 16 18:30:20.072452 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072450 2577 factory.go:223] Registration of the crio container factory successfully Apr 16 18:30:20.072580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.072467 2577 factory.go:103] Registering Raw factory Apr 16 18:30:20.072580 ip-10-0-139-49 kubenswrapper[2577]: I0416 
18:30:20.072478 2577 manager.go:1196] Started watching for new ooms in manager Apr 16 18:30:20.073541 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.073521 2577 manager.go:319] Starting recovery of all containers Apr 16 18:30:20.073952 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.073931 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:20.079201 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.079180 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-49.ec2.internal" not found Apr 16 18:30:20.079201 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.079178 2577 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.083399 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.083379 2577 manager.go:324] Recovery completed Apr 16 18:30:20.087634 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.087622 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.089961 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.089947 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.090018 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.089989 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.090018 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.090002 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.090537 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.090523 2577 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 18:30:20.090537 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:20.090536 2577 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 18:30:20.090626 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.090553 2577 state_mem.go:36] "Initialized new in-memory state store" Apr 16 18:30:20.093033 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.093020 2577 policy_none.go:49] "None policy: Start" Apr 16 18:30:20.093094 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.093037 2577 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 18:30:20.093094 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.093047 2577 state_mem.go:35] "Initializing new in-memory state store" Apr 16 18:30:20.095979 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.095966 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-49.ec2.internal" not found Apr 16 18:30:20.128469 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128450 2577 manager.go:341] "Starting Device Plugin manager" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.128484 2577 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128495 2577 server.go:85] "Starting device plugin registration server" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128788 2577 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128800 2577 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128899 2577 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128980 2577 plugin_manager.go:116] "The desired_state_of_world 
populator (plugin watcher) starts" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.128988 2577 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.129620 2577 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 18:30:20.142155 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.129663 2577 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.153432 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.153410 2577 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-49.ec2.internal" not found Apr 16 18:30:20.206979 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.206905 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 18:30:20.208078 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.208055 2577 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 18:30:20.208154 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.208088 2577 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 18:30:20.208154 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.208119 2577 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 18:30:20.208154 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.208130 2577 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 18:30:20.208291 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.208176 2577 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 18:30:20.213109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.213091 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:20.229840 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.229812 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.230980 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.230962 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.231086 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.230998 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.231086 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.231013 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.231086 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.231043 2577 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.243980 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.243957 2577 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.244082 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.243984 2577 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-49.ec2.internal\": node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.303707 
ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.303677 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.308712 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.308690 2577 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"] Apr 16 18:30:20.308781 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.308762 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.310289 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.310272 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.310398 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.310302 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.310398 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.310312 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.311470 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.311458 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.311658 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.311645 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.311700 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.311674 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.312236 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312220 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.312312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312245 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.312312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312263 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.312312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312274 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.312312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312249 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.312467 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.312345 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.313980 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.313967 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.314050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.313992 2577 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 18:30:20.314644 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.314632 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientMemory" Apr 16 18:30:20.314710 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.314655 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 18:30:20.314710 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.314667 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeHasSufficientPID" Apr 16 18:30:20.337476 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.337454 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.341753 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.341730 2577 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-49.ec2.internal\" not found" node="ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.374311 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.374285 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb503e0bda4c32012b9b964fc44f2748-config\") pod \"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"bb503e0bda4c32012b9b964fc44f2748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.374311 ip-10-0-139-49 kubenswrapper[2577]: I0416 
18:30:20.374312 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.374479 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.374333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.404278 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.404238 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.474719 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474640 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.474719 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb503e0bda4c32012b9b964fc44f2748-config\") pod \"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"bb503e0bda4c32012b9b964fc44f2748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Apr 16 
18:30:20.474719 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474702 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.474949 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474745 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.474949 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0cd359daaf1ce9c0e639e7e589d4b7ef-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal\" (UID: \"0cd359daaf1ce9c0e639e7e589d4b7ef\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.474949 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.474752 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb503e0bda4c32012b9b964fc44f2748-config\") pod \"kube-apiserver-proxy-ip-10-0-139-49.ec2.internal\" (UID: \"bb503e0bda4c32012b9b964fc44f2748\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" Apr 16 18:30:20.504916 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.504890 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found" Apr 16 18:30:20.605502 
ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.605463 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:20.640700 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.640669 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Apr 16 18:30:20.644273 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.644259 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Apr 16 18:30:20.706337 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.706292 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:20.806849 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.806766 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:20.907273 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:20.907238 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:20.974638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.974604 2577 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:30:20.975293 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.974773 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:20.975293 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:20.974781 2577 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:30:21.008032 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:21.008006 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:21.059543 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.059467 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:25:20 +0000 UTC" deadline="2027-10-14 08:40:28.019522267 +0000 UTC"
Apr 16 18:30:21.059543 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.059504 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13094h10m6.96002149s"
Apr 16 18:30:21.070626 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.070594 2577 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:30:21.091967 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.091939 2577 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:30:21.108929 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:21.108902 2577 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-49.ec2.internal\" not found"
Apr 16 18:30:21.111547 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.111528 2577 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-pz2fc"
Apr 16 18:30:21.117461 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.117445 2577 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-pz2fc"
Apr 16 18:30:21.147947 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.147927 2577 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:21.152192 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:21.152163 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd359daaf1ce9c0e639e7e589d4b7ef.slice/crio-14cb8339fb77f05aee893e44bbbfd11c3a8dc705db6a284903891f2bb8138915 WatchSource:0}: Error finding container 14cb8339fb77f05aee893e44bbbfd11c3a8dc705db6a284903891f2bb8138915: Status 404 returned error can't find the container with id 14cb8339fb77f05aee893e44bbbfd11c3a8dc705db6a284903891f2bb8138915
Apr 16 18:30:21.152409 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:21.152391 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb503e0bda4c32012b9b964fc44f2748.slice/crio-5b60a30570eef6455caaf32e090d38d0b54177a1ebb064f06dc1e2704ce45ee0 WatchSource:0}: Error finding container 5b60a30570eef6455caaf32e090d38d0b54177a1ebb064f06dc1e2704ce45ee0: Status 404 returned error can't find the container with id 5b60a30570eef6455caaf32e090d38d0b54177a1ebb064f06dc1e2704ce45ee0
Apr 16 18:30:21.157343 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.157330 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:30:21.171628 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.171610 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal"
Apr 16 18:30:21.183090 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.183075 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:21.185050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.185028 2577 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal"
Apr 16 18:30:21.193580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.193549 2577 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:30:21.211694 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.211646 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"0cd359daaf1ce9c0e639e7e589d4b7ef","Type":"ContainerStarted","Data":"14cb8339fb77f05aee893e44bbbfd11c3a8dc705db6a284903891f2bb8138915"}
Apr 16 18:30:21.212609 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.212588 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" event={"ID":"bb503e0bda4c32012b9b964fc44f2748","Type":"ContainerStarted","Data":"5b60a30570eef6455caaf32e090d38d0b54177a1ebb064f06dc1e2704ce45ee0"}
Apr 16 18:30:21.342994 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:21.342912 2577 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:22.048772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.048744 2577 apiserver.go:52] "Watching apiserver"
Apr 16 18:30:22.057302 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.057275 2577 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:30:22.060075 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.060048 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-7rh4p","openshift-cluster-node-tuning-operator/tuned-zggh2","openshift-image-registry/node-ca-z98p5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal","openshift-multus/multus-gp4vl","openshift-multus/network-metrics-daemon-lvnd8","kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86","openshift-dns/node-resolver-bc429","openshift-multus/multus-additional-cni-plugins-lxp9r","openshift-network-diagnostics/network-check-target-z4lx8","openshift-network-operator/iptables-alerter-w25sz","openshift-ovn-kubernetes/ovnkube-node-qg4jj"]
Apr 16 18:30:22.061521 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.061501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.064001 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.063976 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.064096 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.064062 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.064335 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.064199 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.064335 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.064314 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:30:22.064465 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.064409 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.064465 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.064418 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-dr2x5\""
Apr 16 18:30:22.065412 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.065391 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:22.066275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066252 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.066435 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066414 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.066526 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066487 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2q659\""
Apr 16 18:30:22.066526 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066507 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:30:22.066789 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066772 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.066936 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066919 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8cksw\""
Apr 16 18:30:22.067023 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.066991 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.067570 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.067553 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:30:22.067570 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.067567 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:30:22.067689 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.067625 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-sscph\""
Apr 16 18:30:22.067819 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.067805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.067959 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.067942 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:22.068036 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.068007 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:22.069330 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.069284 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.070016 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.069999 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:30:22.070259 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.070168 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.070259 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.070238 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:30:22.070658 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.070598 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-j78md\""
Apr 16 18:30:22.070658 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.070642 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.071017 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.070995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.072003 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.071818 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-z7h8v\""
Apr 16 18:30:22.072003 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.071845 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:30:22.072185 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072011 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:30:22.072185 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072029 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.072185 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072061 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.072350 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072295 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:30:22.072350 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072339 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:30:22.072653 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.072473 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bc429"
Apr 16 18:30:22.073213 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.073195 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.073331 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.073221 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:30:22.073397 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.073355 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.073476 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.073455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mvzrq\""
Apr 16 18:30:22.074361 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.074058 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.074775 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.074600 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 16 18:30:22.074882 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.074773 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 16 18:30:22.074999 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.074984 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-rhvdt\""
Apr 16 18:30:22.075350 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.075321 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:22.075443 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.075383 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:22.076590 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.076570 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:30:22.077210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.077190 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8cgh9\""
Apr 16 18:30:22.077308 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.077285 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:30:22.084382 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084364 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-os-release\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.084483 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8925d01-c41b-4fb9-b191-1828f898b33e-hosts-file\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429"
Apr 16 18:30:22.084483 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084441 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5b9cf1-84a5-4e49-965d-fca8343e6704-host\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.084483 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvpk\" (UniqueName: \"kubernetes.io/projected/911d663f-58e1-4496-843e-77b3b532f155-kube-api-access-ktvpk\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.084638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084531 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw7qn\" (UniqueName: \"kubernetes.io/projected/ce5b9cf1-84a5-4e49-965d-fca8343e6704-kube-api-access-hw7qn\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.084638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084556 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-netns\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.084638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084593 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.084638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084617 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-run\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-system-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084670 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-multus-certs\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-etc-kubernetes\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084718 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-systemd-units\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-systemd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084766 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.084822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084789 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-ovn\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084851 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084891 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-bin\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084913 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084967 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-registration-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.084991 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-device-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085018 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-modprobe-d\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085041 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-multus\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085064 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cnibin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085087 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-socket-dir-parent\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085153 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-var-lib-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085216 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b83210b7-868a-4c77-8395-676d64fdd6ce-iptables-alerter-script\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085252 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-kubernetes\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085280 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-systemd\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085304 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-lib-modules\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085329 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-daemon-config\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085352 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-etc-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-node-log\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085398 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-etc-tuned\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085421 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-k8s-cni-cncf-io\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085444 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-netns\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085489 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-script-lib\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085525 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjnnz\" (UniqueName: \"kubernetes.io/projected/b83210b7-868a-4c77-8395-676d64fdd6ce-kube-api-access-gjnnz\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.085556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085551 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-agent-certs\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-bin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085600 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-conf-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085675 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-socket-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085715 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce5b9cf1-84a5-4e49-965d-fca8343e6704-serviceca\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085740 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-config\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085755 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b83210b7-868a-4c77-8395-676d64fdd6ce-host-slash\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085770 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-conf\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085785 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-host\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085805 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-kubelet\") pod \"multus-gp4vl\" (UID:
\"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085884 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-slash\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085948 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqnc\" (UniqueName: \"kubernetes.io/projected/0054f737-6e94-4be3-a7c1-1bf463f73c6c-kube-api-access-8dqnc\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.085978 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm5r7\" (UniqueName: \"kubernetes.io/projected/ac636ecd-0958-44f5-9f1b-68a70f234900-kube-api-access-dm5r7\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086002 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpld5\" (UniqueName: \"kubernetes.io/projected/c8925d01-c41b-4fb9-b191-1828f898b33e-kube-api-access-xpld5\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086029 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-d\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.086136 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086049 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-hostroot\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086083 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8925d01-c41b-4fb9-b191-1828f898b33e-tmp-dir\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086111 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-netd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086134 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovn-node-metrics-cert\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086159 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-sys-fs\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086199 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysconfig\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086224 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086270 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2xp7\" (UniqueName: \"kubernetes.io/projected/aaeaee38-a562-498e-b7ec-505134c92159-kube-api-access-j2xp7\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-log-socket\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" 
Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-var-lib-kubelet\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086381 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-tmp\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086414 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbzfk\" (UniqueName: \"kubernetes.io/projected/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-kube-api-access-nbzfk\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086438 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-env-overrides\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cni-binary-copy\") pod \"multus-gp4vl\" (UID: 
\"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086488 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-kubelet\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086512 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-konnectivity-ca\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p" Apr 16 18:30:22.086786 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.086552 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-sys\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.119051 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.119015 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:21 +0000 UTC" deadline="2027-12-14 23:26:37.604757272 +0000 UTC" Apr 16 18:30:22.119051 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.119042 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14572h56m15.48571842s" Apr 16 18:30:22.170298 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.170263 2577 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:30:22.172629 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.172606 2577 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 18:30:22.186964 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.186939 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8dqnc\" (UniqueName: \"kubernetes.io/projected/0054f737-6e94-4be3-a7c1-1bf463f73c6c-kube-api-access-8dqnc\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.186969 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm5r7\" (UniqueName: \"kubernetes.io/projected/ac636ecd-0958-44f5-9f1b-68a70f234900-kube-api-access-dm5r7\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.186993 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpld5\" (UniqueName: \"kubernetes.io/projected/c8925d01-c41b-4fb9-b191-1828f898b33e-kube-api-access-xpld5\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187018 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-d\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187049 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187078 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd4mm\" (UniqueName: \"kubernetes.io/projected/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-kube-api-access-kd4mm\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.187118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187107 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-hostroot\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187132 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8925d01-c41b-4fb9-b191-1828f898b33e-tmp-dir\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187159 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-netd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.187478 
ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187184 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-hostroot\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187192 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-d\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187207 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-netd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovn-node-metrics-cert\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187264 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-sys-fs\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.187478 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:22.187288 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysconfig\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187348 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysconfig\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-sys-fs\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187424 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:22.187478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187445 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c8925d01-c41b-4fb9-b191-1828f898b33e-tmp-dir\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 
18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187494 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187527 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2xp7\" (UniqueName: \"kubernetes.io/projected/aaeaee38-a562-498e-b7ec-505134c92159-kube-api-access-j2xp7\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187555 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-log-socket\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187577 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187588 2577 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-var-lib-kubelet\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187627 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-log-socket\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187630 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-tmp\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cnibin\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187699 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbzfk\" (UniqueName: 
\"kubernetes.io/projected/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-kube-api-access-nbzfk\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187715 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-var-lib-kubelet\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-env-overrides\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187774 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-system-cni-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187803 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-os-release\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187830 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cni-binary-copy\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-kubelet\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187902 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-konnectivity-ca\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p" Apr 16 18:30:22.188079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187924 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-sys\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187970 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-kubelet\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.187981 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-sys\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188042 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-os-release\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188069 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8925d01-c41b-4fb9-b191-1828f898b33e-hosts-file\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188092 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5b9cf1-84a5-4e49-965d-fca8343e6704-host\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188117 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-ktvpk\" (UniqueName: \"kubernetes.io/projected/911d663f-58e1-4496-843e-77b3b532f155-kube-api-access-ktvpk\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hw7qn\" (UniqueName: \"kubernetes.io/projected/ce5b9cf1-84a5-4e49-965d-fca8343e6704-kube-api-access-hw7qn\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188167 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-netns\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188196 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-run\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188268 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-system-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188292 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-multus-certs\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188307 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-env-overrides\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188317 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-etc-kubernetes\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-systemd-units\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188368 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-systemd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188419 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cni-binary-copy\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188426 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188464 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-ovn\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188472 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188520 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188545 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-bin\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188565 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-konnectivity-ca\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-run\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188617 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188573 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-etc-selinux\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188597 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188623 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-os-release\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188645 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-etc-kubernetes\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188666 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c8925d01-c41b-4fb9-b191-1828f898b33e-hosts-file\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188681 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-multus-certs\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188681 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-registration-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.189893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188693 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-ovn-kubernetes\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188521 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-run-netns\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188709 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-ovn\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188717 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-device-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188727 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-run-systemd\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188739 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-cni-bin\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188750 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-system-cni-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188750 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce5b9cf1-84a5-4e49-965d-fca8343e6704-host\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188790 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-registration-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-systemd-units\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-device-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.188821 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-modprobe-d\") pod \"tuned-zggh2\" (UID: 
\"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188922 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.188954 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-multus\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.188993 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:22.688946718 +0000 UTC m=+3.073267037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189061 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-modprobe-d\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.190630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189126 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cnibin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189151 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-multus\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189154 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-socket-dir-parent\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-socket-dir-parent\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189201 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-var-lib-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189197 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-cnibin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189231 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b83210b7-868a-4c77-8395-676d64fdd6ce-iptables-alerter-script\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189244 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-var-lib-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189249 2577 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-kubernetes\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189269 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-systemd\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189316 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-lib-modules\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-daemon-config\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189670 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-kubernetes\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189683 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-etc-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-node-log\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-etc-tuned\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189806 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-binary-copy\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r" Apr 16 18:30:22.191417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189841 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-k8s-cni-cncf-io\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189885 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-etc-openvswitch\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-netns\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189918 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-script-lib\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.189950 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjnnz\" (UniqueName: \"kubernetes.io/projected/b83210b7-868a-4c77-8395-676d64fdd6ce-kube-api-access-gjnnz\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190101 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-agent-certs\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190145 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-bin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190162 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-lib-modules\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190178 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-conf-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190226 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-conf-dir\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190259 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190302 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-socket-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190336 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce5b9cf1-84a5-4e49-965d-fca8343e6704-serviceca\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190363 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-config\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190398 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b83210b7-868a-4c77-8395-676d64fdd6ce-host-slash\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190443 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-conf\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2" Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190484 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-host\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-kubelet\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192210 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-slash\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190663 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-host-slash\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190731 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-kubelet-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.190840 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0054f737-6e94-4be3-a7c1-1bf463f73c6c-node-log\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191041 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/b83210b7-868a-4c77-8395-676d64fdd6ce-iptables-alerter-script\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191224 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac636ecd-0958-44f5-9f1b-68a70f234900-socket-dir\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-tmp\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191338 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-multus-daemon-config\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191508 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-sysctl-conf\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191662 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/ce5b9cf1-84a5-4e49-965d-fca8343e6704-serviceca\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191749 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-host\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191808 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-kubelet\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191880 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b83210b7-868a-4c77-8395-676d64fdd6ce-host-slash\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.191940 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-var-lib-cni-bin\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192014 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-k8s-cni-cncf-io\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192020 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-config\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192073 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-host-run-netns\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192219 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/911d663f-58e1-4496-843e-77b3b532f155-etc-systemd\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.192969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192233 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovn-node-metrics-cert\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.193714 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.192480 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0054f737-6e94-4be3-a7c1-1bf463f73c6c-ovnkube-script-lib\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.195906 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.195876 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/419f0068-d7e6-4c74-bfcf-1f6ab607c8bb-agent-certs\") pod \"konnectivity-agent-7rh4p\" (UID: \"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb\") " pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:22.196879 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.196841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqnc\" (UniqueName: \"kubernetes.io/projected/0054f737-6e94-4be3-a7c1-1bf463f73c6c-kube-api-access-8dqnc\") pod \"ovnkube-node-qg4jj\" (UID: \"0054f737-6e94-4be3-a7c1-1bf463f73c6c\") " pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.197141 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.197121 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/911d663f-58e1-4496-843e-77b3b532f155-etc-tuned\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.197464 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.197440 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm5r7\" (UniqueName: \"kubernetes.io/projected/ac636ecd-0958-44f5-9f1b-68a70f234900-kube-api-access-dm5r7\") pod \"aws-ebs-csi-driver-node-bbl86\" (UID: \"ac636ecd-0958-44f5-9f1b-68a70f234900\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.197785 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.197735 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbzfk\" (UniqueName: \"kubernetes.io/projected/59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2-kube-api-access-nbzfk\") pod \"multus-gp4vl\" (UID: \"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2\") " pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.198649 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.198590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2xp7\" (UniqueName: \"kubernetes.io/projected/aaeaee38-a562-498e-b7ec-505134c92159-kube-api-access-j2xp7\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:22.198649 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.198590 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpld5\" (UniqueName: \"kubernetes.io/projected/c8925d01-c41b-4fb9-b191-1828f898b33e-kube-api-access-xpld5\") pod \"node-resolver-bc429\" (UID: \"c8925d01-c41b-4fb9-b191-1828f898b33e\") " pod="openshift-dns/node-resolver-bc429"
Apr 16 18:30:22.198798 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.198738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvpk\" (UniqueName: \"kubernetes.io/projected/911d663f-58e1-4496-843e-77b3b532f155-kube-api-access-ktvpk\") pod \"tuned-zggh2\" (UID: \"911d663f-58e1-4496-843e-77b3b532f155\") " pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.199704 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.199687 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw7qn\" (UniqueName: \"kubernetes.io/projected/ce5b9cf1-84a5-4e49-965d-fca8343e6704-kube-api-access-hw7qn\") pod \"node-ca-z98p5\" (UID: \"ce5b9cf1-84a5-4e49-965d-fca8343e6704\") " pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.200006 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.199985 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjnnz\" (UniqueName: \"kubernetes.io/projected/b83210b7-868a-4c77-8395-676d64fdd6ce-kube-api-access-gjnnz\") pod \"iptables-alerter-w25sz\" (UID: \"b83210b7-868a-4c77-8395-676d64fdd6ce\") " pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.291877 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.291825 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292046 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.291891 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kd4mm\" (UniqueName: \"kubernetes.io/projected/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-kube-api-access-kd4mm\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292046 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.291928 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:22.292046 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.291959 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cnibin\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292046 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.291986 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-system-cni-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292046 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292010 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-os-release\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292059 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292071 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cnibin\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292090 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-system-cni-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292110 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292136 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-os-release\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292142 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-binary-copy\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292549 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.292658 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.292637 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.293305 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.293281 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.293445 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.293376 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-cni-binary-copy\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.298470 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.298400 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:22.298584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.298474 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:22.298584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.298492 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:22.298584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.298575 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:30:22.79855667 +0000 UTC m=+3.182876976 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:22.300822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.300777 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd4mm\" (UniqueName: \"kubernetes.io/projected/f7e8daeb-176b-4474-94a0-f3d73d0cdf36-kube-api-access-kd4mm\") pod \"multus-additional-cni-plugins-lxp9r\" (UID: \"f7e8daeb-176b-4474-94a0-f3d73d0cdf36\") " pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.376212 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.376176 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-w25sz"
Apr 16 18:30:22.385008 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.384981 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zggh2"
Apr 16 18:30:22.392670 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.392643 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-z98p5"
Apr 16 18:30:22.399251 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.399228 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:22.406109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.406072 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gp4vl"
Apr 16 18:30:22.412689 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.412672 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:22.420264 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.420244 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86"
Apr 16 18:30:22.427756 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.427736 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bc429"
Apr 16 18:30:22.433927 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.433907 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lxp9r"
Apr 16 18:30:22.558553 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.558477 2577 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:30:22.695423 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.695385 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:22.695674 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.695529 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:22.695674 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.695595 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:23.695577999 +0000 UTC m=+4.079898297 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:22.746928 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.746855 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac636ecd_0958_44f5_9f1b_68a70f234900.slice/crio-cd3ff0eeb79cc42114f41d6f45c09fd76b5242dbfae690c9174cd7dc900b733a WatchSource:0}: Error finding container cd3ff0eeb79cc42114f41d6f45c09fd76b5242dbfae690c9174cd7dc900b733a: Status 404 returned error can't find the container with id cd3ff0eeb79cc42114f41d6f45c09fd76b5242dbfae690c9174cd7dc900b733a
Apr 16 18:30:22.748504 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.748475 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce5b9cf1_84a5_4e49_965d_fca8343e6704.slice/crio-8fe83afdf25c3697c36b3212a4755b94b507f16a73a052679b817da54fd039c3 WatchSource:0}: Error finding container 8fe83afdf25c3697c36b3212a4755b94b507f16a73a052679b817da54fd039c3: Status 404 returned error can't find the container with id 8fe83afdf25c3697c36b3212a4755b94b507f16a73a052679b817da54fd039c3
Apr 16 18:30:22.751490 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.751471 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb83210b7_868a_4c77_8395_676d64fdd6ce.slice/crio-ee3da2911e7f5ce4172cde3f2ea8d3080aad39e80317906669d7da43787ff622 WatchSource:0}: Error finding container ee3da2911e7f5ce4172cde3f2ea8d3080aad39e80317906669d7da43787ff622: Status 404 returned error can't find the container with id ee3da2911e7f5ce4172cde3f2ea8d3080aad39e80317906669d7da43787ff622
Apr 16 18:30:22.752558 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.752539 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59c1c7ac_09cf_42a7_8c82_4ce36cbb0ac2.slice/crio-b798b74c18ba15b6881b9ca1a2e7260cd088287b0312c6201170d9380c99ed25 WatchSource:0}: Error finding container b798b74c18ba15b6881b9ca1a2e7260cd088287b0312c6201170d9380c99ed25: Status 404 returned error can't find the container with id b798b74c18ba15b6881b9ca1a2e7260cd088287b0312c6201170d9380c99ed25
Apr 16 18:30:22.753648 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.753583 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911d663f_58e1_4496_843e_77b3b532f155.slice/crio-318485572c872270a687744f12a1afb50c017748b9a6b76010faa9605291175d WatchSource:0}: Error finding container 318485572c872270a687744f12a1afb50c017748b9a6b76010faa9605291175d: Status 404 returned error can't find the container with id 318485572c872270a687744f12a1afb50c017748b9a6b76010faa9605291175d
Apr 16 18:30:22.754651 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:22.754630 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0054f737_6e94_4be3_a7c1_1bf463f73c6c.slice/crio-deb325ea2406409b2aa2a5bc68d06178f836784442b4e6c359bb99d0c54d4c3f WatchSource:0}: Error finding container deb325ea2406409b2aa2a5bc68d06178f836784442b4e6c359bb99d0c54d4c3f: Status 404 returned error can't find the container with id deb325ea2406409b2aa2a5bc68d06178f836784442b4e6c359bb99d0c54d4c3f
Apr 16 18:30:22.897113 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:22.897078 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:22.897263 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.897200 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:22.897263 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.897213 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:22.897263 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.897222 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:22.897374 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:22.897268 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:30:23.897255618 +0000 UTC m=+4.281575920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:23.119241 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.119156 2577 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:25:21 +0000 UTC" deadline="2027-12-17 05:51:49.509812329 +0000 UTC"
Apr 16 18:30:23.119241 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.119196 2577 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14627h21m26.390619969s"
Apr 16 18:30:23.226354 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.226281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7rh4p" event={"ID":"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb","Type":"ContainerStarted","Data":"57f454fcf7906f8fbdab600ebd989955afb64fe814a502879d8bb55c3023b526"}
Apr 16 18:30:23.229810 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.229749 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"deb325ea2406409b2aa2a5bc68d06178f836784442b4e6c359bb99d0c54d4c3f"}
Apr 16 18:30:23.239120 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.236754 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gp4vl" event={"ID":"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2","Type":"ContainerStarted","Data":"b798b74c18ba15b6881b9ca1a2e7260cd088287b0312c6201170d9380c99ed25"}
Apr 16 18:30:23.242192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.242125 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w25sz" event={"ID":"b83210b7-868a-4c77-8395-676d64fdd6ce","Type":"ContainerStarted","Data":"ee3da2911e7f5ce4172cde3f2ea8d3080aad39e80317906669d7da43787ff622"}
Apr 16 18:30:23.249661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.249625 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z98p5" event={"ID":"ce5b9cf1-84a5-4e49-965d-fca8343e6704","Type":"ContainerStarted","Data":"8fe83afdf25c3697c36b3212a4755b94b507f16a73a052679b817da54fd039c3"}
Apr 16 18:30:23.254013 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.253281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" event={"ID":"bb503e0bda4c32012b9b964fc44f2748","Type":"ContainerStarted","Data":"f5515aeff27b7ad5fd3fe54532b2224869a69a69703cda9b2f4a9bc3cb2b1834"}
Apr 16 18:30:23.266205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.266133 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerStarted","Data":"adb41dbd7498d7949ad1166bd833cdfc9bc23f3e31516acfa2d0948ae89694f6"}
Apr 16 18:30:23.268372 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.268323 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bc429" event={"ID":"c8925d01-c41b-4fb9-b191-1828f898b33e","Type":"ContainerStarted","Data":"c802baa3f3a7226e6af0be1412833f15f08a2df11a759428934412093e26fd81"}
Apr 16 18:30:23.279652 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.279616 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zggh2" event={"ID":"911d663f-58e1-4496-843e-77b3b532f155","Type":"ContainerStarted","Data":"318485572c872270a687744f12a1afb50c017748b9a6b76010faa9605291175d"}
Apr 16 18:30:23.284306 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.284034 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" event={"ID":"ac636ecd-0958-44f5-9f1b-68a70f234900","Type":"ContainerStarted","Data":"cd3ff0eeb79cc42114f41d6f45c09fd76b5242dbfae690c9174cd7dc900b733a"}
Apr 16 18:30:23.705188 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.704582 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:23.705188 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.704733 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:23.705188 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.704809 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:25.704788647 +0000 UTC m=+6.089108960 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:23.906756 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:23.906719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:23.906973 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.906946 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:23.906973 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.906970 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:23.907104 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.906983 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:23.907104 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:23.907040 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:30:25.907021077 +0000 UTC m=+6.291341378 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:24.000153 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.000096 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-49.ec2.internal" podStartSLOduration=3.000079042 podStartE2EDuration="3.000079042s" podCreationTimestamp="2026-04-16 18:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:23.270351068 +0000 UTC m=+3.654671393" watchObservedRunningTime="2026-04-16 18:30:24.000079042 +0000 UTC m=+4.384399360"
Apr 16 18:30:24.000337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.000228 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-gsxc5"]
Apr 16 18:30:24.002135 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.002111 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:24.002266 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.002192 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:24.108089 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.108007 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:24.108089 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.108084 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-dbus\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:24.108315 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.108110 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-kubelet-config\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:24.210040 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.210011 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:24.210498 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.210142 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:24.210605 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.210585 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:24.210714 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.210694 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:24.210982 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.210961 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-dbus\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.211064 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.211000 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-kubelet-config\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.211113 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.211061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " 
pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.211195 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.211179 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:24.211248 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.211237 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:24.711219502 +0000 UTC m=+5.095539806 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:24.211395 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.211378 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-dbus\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.211452 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.211442 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/9199c2f9-247b-4dc9-9900-9a6c99aac450-kubelet-config\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.313890 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.313521 2577 generic.go:358] "Generic (PLEG): container finished" podID="0cd359daaf1ce9c0e639e7e589d4b7ef" 
containerID="fd4fd580ce8e349fab27a57002951e7c4351be972700eda534d6810cff092e15" exitCode=0 Apr 16 18:30:24.314909 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.314818 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"0cd359daaf1ce9c0e639e7e589d4b7ef","Type":"ContainerDied","Data":"fd4fd580ce8e349fab27a57002951e7c4351be972700eda534d6810cff092e15"} Apr 16 18:30:24.715191 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:24.715131 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:24.715380 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.715338 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:24.715516 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:24.715500 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:25.715442951 +0000 UTC m=+6.099763273 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:25.209326 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:25.208826 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:25.209326 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.208975 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:25.320113 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:25.320076 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" event={"ID":"0cd359daaf1ce9c0e639e7e589d4b7ef","Type":"ContainerStarted","Data":"fe0bd59b7ac90c0401c14f163355c8e64a523c7ee1f3f27774631c25136d9ca3"} Apr 16 18:30:25.721828 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:25.721758 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:25.721828 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:25.721823 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:25.722069 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.721994 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:25.722069 
ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.722060 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:27.722039756 +0000 UTC m=+8.106360075 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:25.722182 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.722156 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:25.722229 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.722192 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:29.72218027 +0000 UTC m=+10.106500571 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:25.924171 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:25.923873 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:25.924171 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.924059 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:25.924171 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.924078 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:25.924171 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.924091 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:25.924171 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:25.924155 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:29.924134865 +0000 UTC m=+10.308455170 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:26.209526 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:26.208982 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:26.209526 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:26.209112 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:26.209983 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:26.209809 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:26.209983 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:26.209925 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:27.209152 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:27.209116 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:27.209644 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:27.209261 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:27.741715 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:27.741671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:27.741947 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:27.741816 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:27.741947 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:27.741898 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:31.74187796 +0000 UTC m=+12.126198277 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:28.208939 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:28.208699 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:28.208939 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:28.208827 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:28.209249 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:28.209226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:28.209629 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:28.209333 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:29.208564 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:29.208531 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:29.208754 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.208653 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:29.761040 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:29.760998 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:29.761479 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.761158 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.761479 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.761234 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:37.761213081 +0000 UTC m=+18.145533397 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:29.962750 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:29.962627 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:29.962928 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.962813 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:29.962928 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.962837 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:29.962928 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.962849 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:29.962928 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:29.962922 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. 
No retries permitted until 2026-04-16 18:30:37.962901272 +0000 UTC m=+18.347221585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:30.210556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:30.210073 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:30.210556 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:30.210200 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:30.210556 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:30.210315 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:30.210556 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:30.210435 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:31.209041 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:31.209003 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:31.209508 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:31.209143 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:31.777730 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:31.777690 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:31.777942 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:31.777833 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:31.777942 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:31.777910 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:39.777894449 +0000 UTC m=+20.162214772 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:30:32.209231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:32.209199 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:32.209628 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:32.209209 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:32.209628 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:32.209315 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:32.209628 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:32.209362 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:33.208822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:33.208789 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:33.208996 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:33.208916 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:34.209279 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:34.209242 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:34.209723 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:34.209263 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:34.209723 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:34.209379 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:34.209723 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:34.209458 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:35.209031 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:35.208995 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:35.209212 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:35.209121 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:36.208638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:36.208587 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:36.208638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:36.208605 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:36.209138 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:36.208736 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:36.209138 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:36.208869 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:37.208593 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:37.208555 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:37.208771 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:37.208691 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:37.819741 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:37.819703 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:37.819940 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:37.819886 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:37.820035 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:37.819969 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:53.819946983 +0000 UTC m=+34.204267294 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 18:30:38.021226 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:38.021189 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:38.021423 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.021350 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 18:30:38.021423 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.021370 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 18:30:38.021423 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.021384 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:38.021575 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.021446 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:30:54.021431735 +0000 UTC m=+34.405752038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 18:30:38.208901 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:38.208814 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:38.208901 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:38.208816 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:38.209450 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.208965 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:38.209450 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:38.209035 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:39.209060 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:39.209037 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:39.209406 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:39.209136 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:39.834108 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:39.834046 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:39.834290 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:39.834234 2577 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:39.834327 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:39.834297 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret podName:9199c2f9-247b-4dc9-9900-9a6c99aac450 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:55.834278115 +0000 UTC m=+36.218598438 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret") pod "global-pull-secret-syncer-gsxc5" (UID: "9199c2f9-247b-4dc9-9900-9a6c99aac450") : object "kube-system"/"original-pull-secret" not registered
Apr 16 18:30:40.209558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.209343 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:40.210218 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.209419 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:40.210218 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:40.209671 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:40.210218 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:40.209747 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:40.348007 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.347853 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:30:40.348337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.348312 2577 generic.go:358] "Generic (PLEG): container finished" podID="0054f737-6e94-4be3-a7c1-1bf463f73c6c" containerID="41b1a5d7dde99e11b53a6874051989b02726b4e4e756fc6d834a1e61a65ee78a" exitCode=1
Apr 16 18:30:40.348421 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.348380 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"5f71308e0a5cd1a66e7f78bd6f7aec3b6d0d7baa6e79adcb80a82dd533032769"}
Apr 16 18:30:40.348421 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.348409 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerDied","Data":"41b1a5d7dde99e11b53a6874051989b02726b4e4e756fc6d834a1e61a65ee78a"}
Apr 16 18:30:40.348497 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.348423 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"783a01f59cb5ff503327e1622726145f50bf6fa737da199ba6c1ec99dd9ca31d"}
Apr 16 18:30:40.349573 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.349515 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gp4vl" event={"ID":"59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2","Type":"ContainerStarted","Data":"72c4f24b8893cda275e6f3edbf581eaa5af5e283ea7ee98b3cac53b1f66d43a3"}
Apr 16 18:30:40.350723 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.350701 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-z98p5" event={"ID":"ce5b9cf1-84a5-4e49-965d-fca8343e6704","Type":"ContainerStarted","Data":"b377eda84e54c4e08c238c81ffa153146703e6495a15b724c28f0d7bd45a5019"}
Apr 16 18:30:40.351961 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.351938 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="ea8d73f9af2e4e3906c7611224698fd98bc3c3bfa60606ff285b9f5e43ea3e6c" exitCode=0
Apr 16 18:30:40.352045 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.351976 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"ea8d73f9af2e4e3906c7611224698fd98bc3c3bfa60606ff285b9f5e43ea3e6c"}
Apr 16 18:30:40.353377 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.353357 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bc429" event={"ID":"c8925d01-c41b-4fb9-b191-1828f898b33e","Type":"ContainerStarted","Data":"579f52af4de767feb4915924e03a380635040c7373892ac9933c775ad2644dc9"}
Apr 16 18:30:40.354576 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.354560 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zggh2" event={"ID":"911d663f-58e1-4496-843e-77b3b532f155","Type":"ContainerStarted","Data":"67486f1302ae82b95e7cfec412610229e4202131010e2a800e6e503d828e9f12"}
Apr 16 18:30:40.355723 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.355707 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" event={"ID":"ac636ecd-0958-44f5-9f1b-68a70f234900","Type":"ContainerStarted","Data":"780285b0c3d8fa4ba2d62d0ed99364c73196a34328550831c30266ad39b37112"}
Apr 16 18:30:40.356838 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.356817 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-7rh4p" event={"ID":"419f0068-d7e6-4c74-bfcf-1f6ab607c8bb","Type":"ContainerStarted","Data":"7fbb8c3617a71a48835a5b96a2bd972cdbed85de5871fa2543cec9bc63fc9119"}
Apr 16 18:30:40.367386 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.367357 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-49.ec2.internal" podStartSLOduration=19.367347269 podStartE2EDuration="19.367347269s" podCreationTimestamp="2026-04-16 18:30:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:30:25.335901228 +0000 UTC m=+5.720221555" watchObservedRunningTime="2026-04-16 18:30:40.367347269 +0000 UTC m=+20.751667591"
Apr 16 18:30:40.367451 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.367415 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gp4vl" podStartSLOduration=3.372470015 podStartE2EDuration="20.367410726s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.754834239 +0000 UTC m=+3.139154541" lastFinishedPulling="2026-04-16 18:30:39.74977495 +0000 UTC m=+20.134095252" observedRunningTime="2026-04-16 18:30:40.36613174 +0000 UTC m=+20.750452074" watchObservedRunningTime="2026-04-16 18:30:40.367410726 +0000 UTC m=+20.751731063"
Apr 16 18:30:40.381672 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.381638 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zggh2" podStartSLOduration=3.453072482 podStartE2EDuration="20.381626444s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.756140914 +0000 UTC m=+3.140461229" lastFinishedPulling="2026-04-16 18:30:39.684694879 +0000 UTC m=+20.069015191" observedRunningTime="2026-04-16 18:30:40.381365445 +0000 UTC m=+20.765685766" watchObservedRunningTime="2026-04-16 18:30:40.381626444 +0000 UTC m=+20.765946766"
Apr 16 18:30:40.395781 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.395732 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bc429" podStartSLOduration=3.418370562 podStartE2EDuration="20.395716964s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.758388661 +0000 UTC m=+3.142708961" lastFinishedPulling="2026-04-16 18:30:39.735735058 +0000 UTC m=+20.120055363" observedRunningTime="2026-04-16 18:30:40.395013243 +0000 UTC m=+20.779333564" watchObservedRunningTime="2026-04-16 18:30:40.395716964 +0000 UTC m=+20.780037285"
Apr 16 18:30:40.431357 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.431311 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-z98p5" podStartSLOduration=3.497064431 podStartE2EDuration="20.431296381s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.750438379 +0000 UTC m=+3.134758683" lastFinishedPulling="2026-04-16 18:30:39.684670328 +0000 UTC m=+20.068990633" observedRunningTime="2026-04-16 18:30:40.430922663 +0000 UTC m=+20.815242983" watchObservedRunningTime="2026-04-16 18:30:40.431296381 +0000 UTC m=+20.815616705"
Apr 16 18:30:40.452331 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.452282 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-7rh4p" podStartSLOduration=3.526287087 podStartE2EDuration="20.452268079s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.758711634 +0000 UTC m=+3.143031937" lastFinishedPulling="2026-04-16 18:30:39.684692629 +0000 UTC m=+20.069012929" observedRunningTime="2026-04-16 18:30:40.45180573 +0000 UTC m=+20.836126051" watchObservedRunningTime="2026-04-16 18:30:40.452268079 +0000 UTC m=+20.836588401"
Apr 16 18:30:40.908290 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.908208 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:40.908848 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:40.908832 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:41.209084 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.209059 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:41.209209 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:41.209180 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:41.365078 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.365057 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:30:41.365454 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.365401 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"0785df7d248b84f891bd80232bb31503fdc227c57aae65f01af7c2129be5c168"}
Apr 16 18:30:41.365454 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.365435 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"57dc1376ac9e560a8253f7d6681244a887c97029c5dc5c64a7ebffdf2fd1c151"}
Apr 16 18:30:41.365454 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.365445 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"5aa3b263cfd1fec6002afebfb3ae275318181d8336c2c3a57870cfc5134cec49"}
Apr 16 18:30:41.366879 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.366836 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-w25sz" event={"ID":"b83210b7-868a-4c77-8395-676d64fdd6ce","Type":"ContainerStarted","Data":"7849d57ea2765a4dae56617fe7dcdf70d7fb073c18b0828f20be1cabf9729ec5"}
Apr 16 18:30:41.367882 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.367848 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:41.367990 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.367936 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-7rh4p"
Apr 16 18:30:41.382298 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.382253 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-w25sz" podStartSLOduration=4.396673328 podStartE2EDuration="21.382241409s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.753219661 +0000 UTC m=+3.137539963" lastFinishedPulling="2026-04-16 18:30:39.738787729 +0000 UTC m=+20.123108044" observedRunningTime="2026-04-16 18:30:41.381772132 +0000 UTC m=+21.766092455" watchObservedRunningTime="2026-04-16 18:30:41.382241409 +0000 UTC m=+21.766561729"
Apr 16 18:30:41.420848 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:41.420816 2577 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 18:30:42.139580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.139473 2577 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:30:41.420842863Z","UUID":"cbfcd9a8-b5ba-472b-b118-bb892f7f1324","Handler":null,"Name":"","Endpoint":""}
Apr 16 18:30:42.143013 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.142990 2577 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 18:30:42.143142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.143021 2577 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 18:30:42.208905 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.208657 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:42.209061 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.208658 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:42.209061 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:42.209003 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:42.209191 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:42.209120 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:42.370624 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:42.370591 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" event={"ID":"ac636ecd-0958-44f5-9f1b-68a70f234900","Type":"ContainerStarted","Data":"4c62d6d4644fd1d9c38430810dfecc36a8015cf2bd47eec955b5cf5a0b8f941e"}
Apr 16 18:30:43.208641 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:43.208615 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:43.208813 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:43.208708 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:43.374818 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:43.374780 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" event={"ID":"ac636ecd-0958-44f5-9f1b-68a70f234900","Type":"ContainerStarted","Data":"e57052a4e631bea8d9cef16a1fcb10b843984da15718ed3ff82123143e52fa30"}
Apr 16 18:30:43.377913 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:43.377847 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:30:43.378279 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:43.378258 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"364a9e588cbebd3ce3214d05e8a3457b5fa95f75fcd842739f67001034619458"}
Apr 16 18:30:43.391625 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:43.391568 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-bbl86" podStartSLOduration=3.506962841 podStartE2EDuration="23.391554281s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.748911825 +0000 UTC m=+3.133232126" lastFinishedPulling="2026-04-16 18:30:42.633503253 +0000 UTC m=+23.017823566" observedRunningTime="2026-04-16 18:30:43.390934777 +0000 UTC m=+23.775255089" watchObservedRunningTime="2026-04-16 18:30:43.391554281 +0000 UTC m=+23.775874601"
Apr 16 18:30:44.208539 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:44.208507 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:44.208693 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:44.208626 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:44.208693 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:44.208681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:44.208837 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:44.208815 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:45.208960 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.208776 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:45.209532 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:45.209030 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:45.383484 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.383451 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="2048d127b345e61c66dff0f60f6d24d2f170f68a93f404dd6e15a86fccd647b8" exitCode=0
Apr 16 18:30:45.383615 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.383534 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"2048d127b345e61c66dff0f60f6d24d2f170f68a93f404dd6e15a86fccd647b8"}
Apr 16 18:30:45.386651 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.386635 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:30:45.386995 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.386971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"587ab203f4976fe0b1b2c94cf396672040c6a983f39a7c106307ac6420938d2d"}
Apr 16 18:30:45.387292 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.387274 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:45.387351 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.387306 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:45.387438 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.387425 2577 scope.go:117] "RemoveContainer" containerID="41b1a5d7dde99e11b53a6874051989b02726b4e4e756fc6d834a1e61a65ee78a"
Apr 16 18:30:45.402963 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:45.402937 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:46.208445 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.208373 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:46.208445 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.208403 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:46.208546 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:46.208479 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:46.208625 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:46.208601 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:46.392662 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.392580 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:30:46.393105 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.392951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" event={"ID":"0054f737-6e94-4be3-a7c1-1bf463f73c6c","Type":"ContainerStarted","Data":"f03524a33321279277c7505654d37c424bf46a42565fef9ab0beeb8fa3f5e9ab"}
Apr 16 18:30:46.393413 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.393376 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:46.395010 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.394978 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="95b8ccadfd9050c3a5833f482194fdae9d53d3927eeab1455bc17429866208d3" exitCode=0
Apr 16 18:30:46.395093 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.395042 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"95b8ccadfd9050c3a5833f482194fdae9d53d3927eeab1455bc17429866208d3"}
Apr 16 18:30:46.408254 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.408238 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj"
Apr 16 18:30:46.420407 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.420371 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" podStartSLOduration=9.393343312 podStartE2EDuration="26.420360762s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.756505387 +0000 UTC m=+3.140825700" lastFinishedPulling="2026-04-16 18:30:39.783522834 +0000 UTC m=+20.167843150" observedRunningTime="2026-04-16 18:30:46.419977163 +0000 UTC m=+26.804297484" watchObservedRunningTime="2026-04-16 18:30:46.420360762 +0000 UTC m=+26.804681083"
Apr 16 18:30:46.847963 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.847165 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gsxc5"]
Apr 16 18:30:46.847963 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.847518 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5"
Apr 16 18:30:46.847963 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:46.847626 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450"
Apr 16 18:30:46.848337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.847990 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lvnd8"]
Apr 16 18:30:46.848337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.848071 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:30:46.848337 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:46.848173 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:30:46.848687 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.848660 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z4lx8"]
Apr 16 18:30:46.848787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:46.848747 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8"
Apr 16 18:30:46.848873 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:46.848831 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee"
Apr 16 18:30:47.398541 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:47.398448 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="e569831507f0e82efe5c88a89be58aa4b8ab4e37c35e5544dcea89e3b8dfc81d" exitCode=0
Apr 16 18:30:47.398541 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:47.398501 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"e569831507f0e82efe5c88a89be58aa4b8ab4e37c35e5544dcea89e3b8dfc81d"}
Apr 16 18:30:48.208911 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:48.208877 2577 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:48.209132 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:48.208878 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:48.209132 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:48.209027 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:48.209132 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:48.209090 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:49.209184 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:49.209156 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:49.209652 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:49.209263 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:50.210234 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:50.210201 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:50.210234 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:50.210238 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:50.210938 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:50.210313 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:50.210938 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:50.210462 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:51.208868 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:51.208836 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:51.209044 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:51.208977 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:52.209255 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:52.209226 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:52.209763 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:52.209234 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:52.209763 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:52.209363 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159" Apr 16 18:30:52.209763 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:52.209410 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-z4lx8" podUID="62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee" Apr 16 18:30:53.208381 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.208357 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:53.208499 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:53.208476 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-gsxc5" podUID="9199c2f9-247b-4dc9-9900-9a6c99aac450" Apr 16 18:30:53.412005 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.411969 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerStarted","Data":"13a3df3f09d184dbc3a4179079fb93150da226a3942f8b1be902d60cacb777e1"} Apr 16 18:30:53.841418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.841390 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:53.841580 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:53.841520 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:53.841580 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:53.841580 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs 
podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:25.841561447 +0000 UTC m=+66.225881755 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:30:53.917638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.917618 2577 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-49.ec2.internal" event="NodeReady" Apr 16 18:30:53.917772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.917729 2577 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:30:53.970497 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.970461 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-rp9jr"] Apr 16 18:30:53.984944 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.984918 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-f9b8k"] Apr 16 18:30:53.985106 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.985071 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:53.988062 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.987639 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tdfb9\"" Apr 16 18:30:53.988062 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.987827 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:30:53.988062 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:53.988055 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:30:54.020847 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.020766 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f9b8k"] Apr 16 18:30:54.020969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.020850 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.020969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.020878 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rp9jr"] Apr 16 18:30:54.023377 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.023359 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:30:54.023485 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.023385 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:30:54.023485 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.023397 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6c796\"" Apr 16 18:30:54.023485 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.023397 2577 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:30:54.043162 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.043141 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:54.043278 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.043262 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:30:54.043339 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.043284 2577 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:30:54.043339 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.043297 2577 projected.go:194] Error preparing data for projected volume kube-api-access-5nhqx for pod openshift-network-diagnostics/network-check-target-z4lx8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:54.043441 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.043357 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx podName:62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee nodeName:}" failed. No retries permitted until 2026-04-16 18:31:26.043338349 +0000 UTC m=+66.427658662 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5nhqx" (UniqueName: "kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx") pod "network-check-target-z4lx8" (UID: "62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:30:54.143706 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143655 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83f97e63-c527-4a65-9e6d-3bb32971b8d9-config-volume\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.143706 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143683 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x97sk\" (UniqueName: \"kubernetes.io/projected/83f97e63-c527-4a65-9e6d-3bb32971b8d9-kube-api-access-x97sk\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.143706 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143702 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83f97e63-c527-4a65-9e6d-3bb32971b8d9-tmp-dir\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.143841 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143730 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5xst\" (UniqueName: \"kubernetes.io/projected/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-kube-api-access-s5xst\") pod \"ingress-canary-f9b8k\" (UID: 
\"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.143841 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143831 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.143936 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.143889 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.209050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.209028 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:30:54.209124 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.209057 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:30:54.212064 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.212044 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:30:54.212180 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.212100 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:30:54.212180 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.212047 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:30:54.212367 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.212215 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kxnrw\"" Apr 16 18:30:54.212367 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.212334 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7lgq2\"" Apr 16 18:30:54.244929 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.244907 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5xst\" (UniqueName: \"kubernetes.io/projected/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-kube-api-access-s5xst\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.245049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.244963 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.245049 
ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.244995 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.245049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.245043 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83f97e63-c527-4a65-9e6d-3bb32971b8d9-config-volume\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.245053 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.245070 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x97sk\" (UniqueName: \"kubernetes.io/projected/83f97e63-c527-4a65-9e6d-3bb32971b8d9-kube-api-access-x97sk\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.245080 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.245096 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83f97e63-c527-4a65-9e6d-3bb32971b8d9-tmp-dir\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: E0416 
18:30:54.245113 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:54.745093548 +0000 UTC m=+35.129413849 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:54.245187 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.245141 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:54.745124892 +0000 UTC m=+35.129445195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:30:54.245474 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.245386 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83f97e63-c527-4a65-9e6d-3bb32971b8d9-tmp-dir\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.245714 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.245695 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/83f97e63-c527-4a65-9e6d-3bb32971b8d9-config-volume\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.255353 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:30:54.255327 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x97sk\" (UniqueName: \"kubernetes.io/projected/83f97e63-c527-4a65-9e6d-3bb32971b8d9-kube-api-access-x97sk\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.255431 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.255391 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5xst\" (UniqueName: \"kubernetes.io/projected/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-kube-api-access-s5xst\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.415751 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.415685 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="13a3df3f09d184dbc3a4179079fb93150da226a3942f8b1be902d60cacb777e1" exitCode=0 Apr 16 18:30:54.415751 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.415733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"13a3df3f09d184dbc3a4179079fb93150da226a3942f8b1be902d60cacb777e1"} Apr 16 18:30:54.749490 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.749232 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:54.749490 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:54.749441 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:54.749490 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.749377 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:54.749699 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.749514 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:54.749699 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.749558 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:55.749540686 +0000 UTC m=+36.133860989 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:54.749699 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:54.749573 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:55.749567151 +0000 UTC m=+36.133887449 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:30:55.208336 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.208308 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:55.211154 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.211138 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:30:55.419498 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.419462 2577 generic.go:358] "Generic (PLEG): container finished" podID="f7e8daeb-176b-4474-94a0-f3d73d0cdf36" containerID="bcdb145fdbb3b785c9020092a38ef69ae65738c4951274bfd3eccfb6565ca720" exitCode=0 Apr 16 18:30:55.419832 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.419535 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerDied","Data":"bcdb145fdbb3b785c9020092a38ef69ae65738c4951274bfd3eccfb6565ca720"} Apr 16 18:30:55.755982 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.755911 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:55.755982 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.755951 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " 
pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:55.756144 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:55.756057 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:55.756144 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:55.756111 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.756097101 +0000 UTC m=+38.140417404 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:30:55.756144 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:55.756057 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:55.756265 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:55.756172 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:30:57.756161484 +0000 UTC m=+38.140481788 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:30:55.856458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.856423 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:55.863610 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:55.863586 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/9199c2f9-247b-4dc9-9900-9a6c99aac450-original-pull-secret\") pod \"global-pull-secret-syncer-gsxc5\" (UID: \"9199c2f9-247b-4dc9-9900-9a6c99aac450\") " pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:56.127274 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:56.127242 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-gsxc5" Apr 16 18:30:56.289102 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:56.289073 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-gsxc5"] Apr 16 18:30:56.294085 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:30:56.293943 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9199c2f9_247b_4dc9_9900_9a6c99aac450.slice/crio-ba13e505b97f5000e24af393d8dfbdddaa6457e1d9cb490b1f46275f0fe2e504 WatchSource:0}: Error finding container ba13e505b97f5000e24af393d8dfbdddaa6457e1d9cb490b1f46275f0fe2e504: Status 404 returned error can't find the container with id ba13e505b97f5000e24af393d8dfbdddaa6457e1d9cb490b1f46275f0fe2e504 Apr 16 18:30:56.423876 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:56.423837 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" event={"ID":"f7e8daeb-176b-4474-94a0-f3d73d0cdf36","Type":"ContainerStarted","Data":"88ba090b33e4d0f87cf3b97737150cf765c25dc590282cf240dfbc10f912f1f4"} Apr 16 18:30:56.424912 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:56.424885 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gsxc5" event={"ID":"9199c2f9-247b-4dc9-9900-9a6c99aac450","Type":"ContainerStarted","Data":"ba13e505b97f5000e24af393d8dfbdddaa6457e1d9cb490b1f46275f0fe2e504"} Apr 16 18:30:56.450492 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:56.450452 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lxp9r" podStartSLOduration=6.041609899 podStartE2EDuration="36.450440091s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:30:22.761980822 +0000 UTC m=+3.146301135" lastFinishedPulling="2026-04-16 18:30:53.170811024 +0000 UTC m=+33.555131327" 
observedRunningTime="2026-04-16 18:30:56.448814986 +0000 UTC m=+36.833135306" watchObservedRunningTime="2026-04-16 18:30:56.450440091 +0000 UTC m=+36.834760412" Apr 16 18:30:57.771613 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:57.771578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:30:57.772089 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:30:57.771625 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:30:57.772089 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:57.771754 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:30:57.772089 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:57.771754 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:30:57.772089 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:57.771809 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.771792904 +0000 UTC m=+42.156113204 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:30:57.772089 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:30:57.771824 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:01.771816347 +0000 UTC m=+42.156136645 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:00.434148 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:00.434114 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-gsxc5" event={"ID":"9199c2f9-247b-4dc9-9900-9a6c99aac450","Type":"ContainerStarted","Data":"7d066cd160a50dd75402a699e19cdd86cd95f743a182066d14fcf28a3687f479"} Apr 16 18:31:00.449388 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:00.449346 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-gsxc5" podStartSLOduration=33.91206914 podStartE2EDuration="37.449330066s" podCreationTimestamp="2026-04-16 18:30:23 +0000 UTC" firstStartedPulling="2026-04-16 18:30:56.295642569 +0000 UTC m=+36.679962868" lastFinishedPulling="2026-04-16 18:30:59.832903491 +0000 UTC m=+40.217223794" observedRunningTime="2026-04-16 18:31:00.449255879 +0000 UTC m=+40.833576197" watchObservedRunningTime="2026-04-16 18:31:00.449330066 +0000 UTC m=+40.833650365" Apr 16 18:31:01.797906 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:01.797871 2577 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:31:01.798352 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:01.797960 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:31:01.798352 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:01.797964 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:01.798352 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:01.798055 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:09.798032726 +0000 UTC m=+50.182353038 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:31:01.798352 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:01.798066 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:01.798352 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:01.798117 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. 
No retries permitted until 2026-04-16 18:31:09.798102969 +0000 UTC m=+50.182423280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:09.855079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:09.855037 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:31:09.855584 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:09.855105 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:31:09.855584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:09.855184 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:09.855584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:09.855192 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:09.855584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:09.855246 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:25.855231841 +0000 UTC m=+66.239552144 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:09.855584 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:09.855259 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:25.855253839 +0000 UTC m=+66.239574139 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:31:18.410443 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:18.410418 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qg4jj" Apr 16 18:31:25.865289 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:25.865238 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.865394 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:25.865410 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: 
\"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.865457 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:57.865441484 +0000 UTC m=+98.249761787 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:25.865475 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.865543 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:25.865712 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.865567 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:31:57.865558994 +0000 UTC m=+98.249879293 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:31:25.868022 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:25.868007 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:31:25.876023 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.876008 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:31:25.876080 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:25.876062 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:29.87605003 +0000 UTC m=+130.260370333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : secret "metrics-daemon-secret" not found Apr 16 18:31:26.066555 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.066525 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:31:26.069118 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.069103 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:31:26.079401 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.079383 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:31:26.091628 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.091609 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nhqx\" (UniqueName: \"kubernetes.io/projected/62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee-kube-api-access-5nhqx\") pod \"network-check-target-z4lx8\" (UID: \"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee\") " pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:31:26.321891 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.321844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-7lgq2\"" Apr 16 18:31:26.329846 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.329827 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:31:26.440120 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.440090 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-z4lx8"] Apr 16 18:31:26.443178 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:31:26.443147 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62c9ee4c_19ed_4818_b5b8_5f08ec96d7ee.slice/crio-11615bda5c1d210bf93c0e701b48303565b8849d34986a362817d0cda2ad1b09 WatchSource:0}: Error finding container 11615bda5c1d210bf93c0e701b48303565b8849d34986a362817d0cda2ad1b09: Status 404 returned error can't find the container with id 11615bda5c1d210bf93c0e701b48303565b8849d34986a362817d0cda2ad1b09 Apr 16 18:31:26.479829 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:26.479794 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z4lx8" event={"ID":"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee","Type":"ContainerStarted","Data":"11615bda5c1d210bf93c0e701b48303565b8849d34986a362817d0cda2ad1b09"} Apr 16 18:31:29.486528 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:29.486489 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-z4lx8" event={"ID":"62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee","Type":"ContainerStarted","Data":"5de096cca3788f1da4902cd23220c002b6a89f9eb16d4a0b991dafa644fb7bcd"} Apr 16 18:31:29.486908 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:29.486599 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:31:29.502319 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:29.502277 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-z4lx8" 
podStartSLOduration=66.927244757 podStartE2EDuration="1m9.502264285s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:31:26.444889218 +0000 UTC m=+66.829209530" lastFinishedPulling="2026-04-16 18:31:29.019908743 +0000 UTC m=+69.404229058" observedRunningTime="2026-04-16 18:31:29.502030987 +0000 UTC m=+69.886351340" watchObservedRunningTime="2026-04-16 18:31:29.502264285 +0000 UTC m=+69.886584606" Apr 16 18:31:57.884202 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:57.884150 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k" Apr 16 18:31:57.884662 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:31:57.884229 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr" Apr 16 18:31:57.884662 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:57.884297 2577 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:31:57.884662 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:57.884306 2577 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:31:57.884662 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:57.884360 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert podName:84c80ef6-0ecb-44ac-9d4b-c6004661b2f5 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:01.884344987 +0000 UTC m=+162.268665291 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert") pod "ingress-canary-f9b8k" (UID: "84c80ef6-0ecb-44ac-9d4b-c6004661b2f5") : secret "canary-serving-cert" not found Apr 16 18:31:57.884662 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:31:57.884375 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls podName:83f97e63-c527-4a65-9e6d-3bb32971b8d9 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:01.884368702 +0000 UTC m=+162.268689001 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls") pod "dns-default-rp9jr" (UID: "83f97e63-c527-4a65-9e6d-3bb32971b8d9") : secret "dns-default-metrics-tls" not found Apr 16 18:32:00.490915 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:00.490887 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-z4lx8" Apr 16 18:32:29.897076 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:29.897024 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:32:29.897582 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:29.897176 2577 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:32:29.897582 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:29.897256 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs podName:aaeaee38-a562-498e-b7ec-505134c92159 nodeName:}" 
failed. No retries permitted until 2026-04-16 18:34:31.897237384 +0000 UTC m=+252.281557683 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs") pod "network-metrics-daemon-lvnd8" (UID: "aaeaee38-a562-498e-b7ec-505134c92159") : secret "metrics-daemon-secret" not found Apr 16 18:32:31.597898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.597855 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-crf8r"] Apr 16 18:32:31.600571 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.600556 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.602926 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.602907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 18:32:31.603230 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.603214 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-x9g2x\"" Apr 16 18:32:31.604537 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.604463 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 18:32:31.604537 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.604493 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:32:31.604695 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.604658 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:32:31.610577 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.610560 2577 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 18:32:31.612265 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.612247 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-crf8r"] Apr 16 18:32:31.703617 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.703596 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"] Apr 16 18:32:31.706229 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.706213 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-665bf87cd4-2mxqq"] Apr 16 18:32:31.706362 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.706345 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" Apr 16 18:32:31.707953 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.707929 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-tmp\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708064 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.707958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-snapshots\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708064 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.707994 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708064 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.708023 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xk5q\" (UniqueName: \"kubernetes.io/projected/ed5ef1cb-7b4d-4a71-a177-306862891c7a-kube-api-access-7xk5q\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708199 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.708074 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708199 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.708123 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5ef1cb-7b4d-4a71-a177-306862891c7a-serving-cert\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" Apr 16 18:32:31.708977 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.708962 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.709093 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.709076 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 18:32:31.709449 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.709432 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 18:32:31.709776 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.709763 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.709854 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.709790 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.709854 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.709787 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-7lnvx\""
Apr 16 18:32:31.711772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.711754 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 18:32:31.711895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.711878 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.712146 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.712133 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 18:32:31.712526 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.712506 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 18:32:31.712624 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.712529 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9l8l5\""
Apr 16 18:32:31.712931 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.712815 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.713019 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.713003 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 18:32:31.720447 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.720424 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"]
Apr 16 18:32:31.721176 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.721157 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-665bf87cd4-2mxqq"]
Apr 16 18:32:31.798998 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.798974 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"]
Apr 16 18:32:31.801694 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.801681 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:31.804467 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.804430 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.804604 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.804533 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-2879m\""
Apr 16 18:32:31.804797 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.804651 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-tlltr"]
Apr 16 18:32:31.804894 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.804808 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.805021 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.805004 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 18:32:31.807430 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.807405 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:31.808471 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808455 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-stats-auth\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.808551 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808493 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808551 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808534 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xk5q\" (UniqueName: \"kubernetes.io/projected/ed5ef1cb-7b4d-4a71-a177-306862891c7a-kube-api-access-7xk5q\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808636 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808562 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4v7r\" (UniqueName: \"kubernetes.io/projected/eda65526-e0cd-496b-8e27-579365e644c6-kube-api-access-t4v7r\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.808636 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808585 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f78be17-5c3e-439a-8923-6ef9297713a5-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.808636 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808752 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808653 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-default-certificate\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.808752 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808679 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.808752 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5ef1cb-7b4d-4a71-a177-306862891c7a-serving-cert\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808941 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808815 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f78be17-5c3e-439a-8923-6ef9297713a5-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.808941 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808850 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-tmp\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808941 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808896 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-snapshots\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.808941 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808926 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl7jf\" (UniqueName: \"kubernetes.io/projected/0f78be17-5c3e-439a-8923-6ef9297713a5-kube-api-access-dl7jf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.809126 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.808956 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.809275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809258 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-tmp\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.809336 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809316 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.809390 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809370 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ed5ef1cb-7b4d-4a71-a177-306862891c7a-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.809637 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809619 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/ed5ef1cb-7b4d-4a71-a177-306862891c7a-snapshots\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.809717 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809703 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-cfp5k\""
Apr 16 18:32:31.809893 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809874 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 18:32:31.809962 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.809920 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 18:32:31.810157 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.810139 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:32:31.810238 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.810168 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 18:32:31.811297 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.811279 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed5ef1cb-7b4d-4a71-a177-306862891c7a-serving-cert\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.817361 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.817343 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 18:32:31.824272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.824253 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"]
Apr 16 18:32:31.827540 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.827518 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-tlltr"]
Apr 16 18:32:31.835414 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.835396 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xk5q\" (UniqueName: \"kubernetes.io/projected/ed5ef1cb-7b4d-4a71-a177-306862891c7a-kube-api-access-7xk5q\") pod \"insights-operator-5785d4fcdd-crf8r\" (UID: \"ed5ef1cb-7b4d-4a71-a177-306862891c7a\") " pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.910296 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910221 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4v7r\" (UniqueName: \"kubernetes.io/projected/eda65526-e0cd-496b-8e27-579365e644c6-kube-api-access-t4v7r\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.910296 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910253 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f78be17-5c3e-439a-8923-6ef9297713a5-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.910296 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-default-certificate\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910302 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910330 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910403 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f78be17-5c3e-439a-8923-6ef9297713a5-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910442 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dl7jf\" (UniqueName: \"kubernetes.io/projected/0f78be17-5c3e-439a-8923-6ef9297713a5-kube-api-access-dl7jf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:31.910463 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910468 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.910531 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:31.910532 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.410510571 +0000 UTC m=+132.794830884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : secret "router-metrics-certs-default" not found
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:31.910561 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.41054883 +0000 UTC m=+132.794869129 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : configmap references non-existent config key: service-ca.crt
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910579 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqgwm\" (UniqueName: \"kubernetes.io/projected/3e2da103-e6f7-418f-b965-51e093a27d1a-kube-api-access-xqgwm\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910608 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-config\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910628 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-trusted-ca\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910650 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-stats-auth\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910692 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d42c587a-978d-479c-a50c-6817173ee86b-serving-cert\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:31.910852 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910722 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dphz\" (UniqueName: \"kubernetes.io/projected/d42c587a-978d-479c-a50c-6817173ee86b-kube-api-access-4dphz\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:31.911153 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.910916 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f78be17-5c3e-439a-8923-6ef9297713a5-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.912087 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.912068 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r"
Apr 16 18:32:31.913056 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.912712 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0f78be17-5c3e-439a-8923-6ef9297713a5-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.913056 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.912804 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-default-certificate\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.913323 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.913301 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-stats-auth\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:31.919589 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.919570 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl7jf\" (UniqueName: \"kubernetes.io/projected/0f78be17-5c3e-439a-8923-6ef9297713a5-kube-api-access-dl7jf\") pod \"kube-storage-version-migrator-operator-756bb7d76f-p9g8c\" (UID: \"0f78be17-5c3e-439a-8923-6ef9297713a5\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:31.919983 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:31.919964 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4v7r\" (UniqueName: \"kubernetes.io/projected/eda65526-e0cd-496b-8e27-579365e644c6-kube-api-access-t4v7r\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:32.011238 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011208 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqgwm\" (UniqueName: \"kubernetes.io/projected/3e2da103-e6f7-418f-b965-51e093a27d1a-kube-api-access-xqgwm\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:32.011375 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011256 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-config\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.011375 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011275 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-trusted-ca\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.011375 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011312 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d42c587a-978d-479c-a50c-6817173ee86b-serving-cert\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.011375 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011342 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dphz\" (UniqueName: \"kubernetes.io/projected/d42c587a-978d-479c-a50c-6817173ee86b-kube-api-access-4dphz\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.011575 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011386 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:32.011715 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.011694 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:32.011793 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.011781 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls podName:3e2da103-e6f7-418f-b965-51e093a27d1a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:32.511759776 +0000 UTC m=+132.896080086 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls") pod "cluster-samples-operator-667775844f-4xbn8" (UID: "3e2da103-e6f7-418f-b965-51e093a27d1a") : secret "samples-operator-tls" not found
Apr 16 18:32:32.012017 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.011995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-config\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.012142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.012126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d42c587a-978d-479c-a50c-6817173ee86b-trusted-ca\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.013718 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.013700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d42c587a-978d-479c-a50c-6817173ee86b-serving-cert\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.015992 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.015979 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"
Apr 16 18:32:32.020054 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.020029 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dphz\" (UniqueName: \"kubernetes.io/projected/d42c587a-978d-479c-a50c-6817173ee86b-kube-api-access-4dphz\") pod \"console-operator-d87b8d5fc-tlltr\" (UID: \"d42c587a-978d-479c-a50c-6817173ee86b\") " pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.020219 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.020201 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqgwm\" (UniqueName: \"kubernetes.io/projected/3e2da103-e6f7-418f-b965-51e093a27d1a-kube-api-access-xqgwm\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:32.025543 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.025514 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-crf8r"]
Apr 16 18:32:32.028575 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:32.028554 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5ef1cb_7b4d_4a71_a177_306862891c7a.slice/crio-c9ed14d8684465d14da8c257d89fb92df6944afe5fadb54e44899e34b65c79d5 WatchSource:0}: Error finding container c9ed14d8684465d14da8c257d89fb92df6944afe5fadb54e44899e34b65c79d5: Status 404 returned error can't find the container with id c9ed14d8684465d14da8c257d89fb92df6944afe5fadb54e44899e34b65c79d5
Apr 16 18:32:32.122152 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.122129 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:32:32.125165 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.125145 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c"]
Apr 16 18:32:32.128025 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:32.127995 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f78be17_5c3e_439a_8923_6ef9297713a5.slice/crio-71029859224bca3252464302adc3bf6fe8b32721dc09c5ff7db38f896b1c7d1c WatchSource:0}: Error finding container 71029859224bca3252464302adc3bf6fe8b32721dc09c5ff7db38f896b1c7d1c: Status 404 returned error can't find the container with id 71029859224bca3252464302adc3bf6fe8b32721dc09c5ff7db38f896b1c7d1c
Apr 16 18:32:32.233900 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.233743 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-tlltr"]
Apr 16 18:32:32.236153 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:32.236125 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd42c587a_978d_479c_a50c_6817173ee86b.slice/crio-b8cdef4c4f32142b6a60b30eec22f267c8e253a305f187b98a28279b5f064e33 WatchSource:0}: Error finding container b8cdef4c4f32142b6a60b30eec22f267c8e253a305f187b98a28279b5f064e33: Status 404 returned error can't find the container with id b8cdef4c4f32142b6a60b30eec22f267c8e253a305f187b98a28279b5f064e33
Apr 16 18:32:32.414654 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.414626 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:32.414834 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.414687 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:32.414834 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.414784 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.414771204 +0000 UTC m=+133.799091502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : configmap references non-existent config key: service-ca.crt
Apr 16 18:32:32.414834 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.414783 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 18:32:32.414989 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.414844 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.414827602 +0000 UTC m=+133.799147919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : secret "router-metrics-certs-default" not found
Apr 16 18:32:32.516002 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.515923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:32.516152 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.516100 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 18:32:32.516206 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:32.516180 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls podName:3e2da103-e6f7-418f-b965-51e093a27d1a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:33.51615816 +0000 UTC m=+133.900478462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls") pod "cluster-samples-operator-667775844f-4xbn8" (UID: "3e2da103-e6f7-418f-b965-51e093a27d1a") : secret "samples-operator-tls" not found
Apr 16 18:32:32.600383 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.600343 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" event={"ID":"d42c587a-978d-479c-a50c-6817173ee86b","Type":"ContainerStarted","Data":"b8cdef4c4f32142b6a60b30eec22f267c8e253a305f187b98a28279b5f064e33"}
Apr 16 18:32:32.601418 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.601389 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" event={"ID":"ed5ef1cb-7b4d-4a71-a177-306862891c7a","Type":"ContainerStarted","Data":"c9ed14d8684465d14da8c257d89fb92df6944afe5fadb54e44899e34b65c79d5"}
Apr 16 18:32:32.602461 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:32.602427 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" event={"ID":"0f78be17-5c3e-439a-8923-6ef9297713a5","Type":"ContainerStarted","Data":"71029859224bca3252464302adc3bf6fe8b32721dc09c5ff7db38f896b1c7d1c"}
Apr 16 18:32:33.424465 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:33.424415 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:33.424655 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:33.424568 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:33.424655 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:33.424598 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:35.424575321 +0000 UTC m=+135.808895623 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:33.424655 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:33.424649 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:33.424826 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:33.424688 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:35.424677752 +0000 UTC m=+135.808998055 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : secret "router-metrics-certs-default" not found Apr 16 18:32:33.525521 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:33.525478 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" Apr 16 18:32:33.525682 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:33.525617 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:32:33.525747 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:33.525693 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls podName:3e2da103-e6f7-418f-b965-51e093a27d1a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:35.525675679 +0000 UTC m=+135.909995997 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls") pod "cluster-samples-operator-667775844f-4xbn8" (UID: "3e2da103-e6f7-418f-b965-51e093a27d1a") : secret "samples-operator-tls" not found Apr 16 18:32:34.201424 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.201388 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2"] Apr 16 18:32:34.204764 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.204741 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" Apr 16 18:32:34.207437 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.207403 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-dpn2m\"" Apr 16 18:32:34.212365 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.212337 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2"] Apr 16 18:32:34.332614 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.332583 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w75zt\" (UniqueName: \"kubernetes.io/projected/8cde21b7-36c1-4558-9fda-d27494f89492-kube-api-access-w75zt\") pod \"network-check-source-7b678d77c7-x86k2\" (UID: \"8cde21b7-36c1-4558-9fda-d27494f89492\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" Apr 16 18:32:34.433266 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.433230 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w75zt\" (UniqueName: \"kubernetes.io/projected/8cde21b7-36c1-4558-9fda-d27494f89492-kube-api-access-w75zt\") pod \"network-check-source-7b678d77c7-x86k2\" (UID: 
\"8cde21b7-36c1-4558-9fda-d27494f89492\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" Apr 16 18:32:34.442158 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.442125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w75zt\" (UniqueName: \"kubernetes.io/projected/8cde21b7-36c1-4558-9fda-d27494f89492-kube-api-access-w75zt\") pod \"network-check-source-7b678d77c7-x86k2\" (UID: \"8cde21b7-36c1-4558-9fda-d27494f89492\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" Apr 16 18:32:34.516477 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.516389 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" Apr 16 18:32:34.964367 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:34.964335 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2"] Apr 16 18:32:34.969287 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:34.969245 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cde21b7_36c1_4558_9fda_d27494f89492.slice/crio-fb63fbd08fe310670c9e3485298746cc1b7e9e76d5e44b0b97ca458a9d7d7c70 WatchSource:0}: Error finding container fb63fbd08fe310670c9e3485298746cc1b7e9e76d5e44b0b97ca458a9d7d7c70: Status 404 returned error can't find the container with id fb63fbd08fe310670c9e3485298746cc1b7e9e76d5e44b0b97ca458a9d7d7c70 Apr 16 18:32:35.442819 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.442739 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 
16 18:32:35.443301 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:35.442949 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:39.442926799 +0000 UTC m=+139.827247110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:35.443301 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.443017 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:35.443301 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:35.443145 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:35.443301 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:35.443182 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:39.443171311 +0000 UTC m=+139.827491618 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : secret "router-metrics-certs-default" not found Apr 16 18:32:35.544275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.544240 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" Apr 16 18:32:35.544465 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:35.544382 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:32:35.544465 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:35.544442 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls podName:3e2da103-e6f7-418f-b965-51e093a27d1a nodeName:}" failed. No retries permitted until 2026-04-16 18:32:39.544426671 +0000 UTC m=+139.928746975 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls") pod "cluster-samples-operator-667775844f-4xbn8" (UID: "3e2da103-e6f7-418f-b965-51e093a27d1a") : secret "samples-operator-tls" not found Apr 16 18:32:35.612105 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.612075 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/0.log" Apr 16 18:32:35.612252 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.612122 2577 generic.go:358] "Generic (PLEG): container finished" podID="d42c587a-978d-479c-a50c-6817173ee86b" containerID="de7717f21459c7e35915397a2c82eebf9af3750de0876edba247f8511e835d36" exitCode=255 Apr 16 18:32:35.612252 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.612194 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" event={"ID":"d42c587a-978d-479c-a50c-6817173ee86b","Type":"ContainerDied","Data":"de7717f21459c7e35915397a2c82eebf9af3750de0876edba247f8511e835d36"} Apr 16 18:32:35.612486 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.612464 2577 scope.go:117] "RemoveContainer" containerID="de7717f21459c7e35915397a2c82eebf9af3750de0876edba247f8511e835d36" Apr 16 18:32:35.613914 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.613890 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" event={"ID":"ed5ef1cb-7b4d-4a71-a177-306862891c7a","Type":"ContainerStarted","Data":"209bb4a4356b14c6a689f1e245dd18cea7ae4399171bb0a7d8fa7ba78842fce3"} Apr 16 18:32:35.615307 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.615281 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" 
event={"ID":"0f78be17-5c3e-439a-8923-6ef9297713a5","Type":"ContainerStarted","Data":"62b987df5e524734bb1a87893bcee21929a0dae0ecb9b519e2f7ff306880feda"} Apr 16 18:32:35.616969 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.616946 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" event={"ID":"8cde21b7-36c1-4558-9fda-d27494f89492","Type":"ContainerStarted","Data":"600854a786cbea27df2df47b62fa8076cdc1c86f77ba3e2f48f10501d55a9b28"} Apr 16 18:32:35.617098 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.617082 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" event={"ID":"8cde21b7-36c1-4558-9fda-d27494f89492","Type":"ContainerStarted","Data":"fb63fbd08fe310670c9e3485298746cc1b7e9e76d5e44b0b97ca458a9d7d7c70"} Apr 16 18:32:35.649310 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.649261 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-x86k2" podStartSLOduration=1.649244123 podStartE2EDuration="1.649244123s" podCreationTimestamp="2026-04-16 18:32:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:35.648108529 +0000 UTC m=+136.032428851" watchObservedRunningTime="2026-04-16 18:32:35.649244123 +0000 UTC m=+136.033564443" Apr 16 18:32:35.662874 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.662813 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" podStartSLOduration=1.95492618 podStartE2EDuration="4.662797851s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:32.12984769 +0000 UTC m=+132.514167992" lastFinishedPulling="2026-04-16 
18:32:34.83771936 +0000 UTC m=+135.222039663" observedRunningTime="2026-04-16 18:32:35.661957458 +0000 UTC m=+136.046277781" watchObservedRunningTime="2026-04-16 18:32:35.662797851 +0000 UTC m=+136.047118174" Apr 16 18:32:35.680230 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:35.680188 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" podStartSLOduration=1.8707993 podStartE2EDuration="4.680177432s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:32.030205097 +0000 UTC m=+132.414525397" lastFinishedPulling="2026-04-16 18:32:34.839583217 +0000 UTC m=+135.223903529" observedRunningTime="2026-04-16 18:32:35.679874418 +0000 UTC m=+136.064194734" watchObservedRunningTime="2026-04-16 18:32:35.680177432 +0000 UTC m=+136.064497751" Apr 16 18:32:36.620757 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.620730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/1.log" Apr 16 18:32:36.621161 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.621063 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/0.log" Apr 16 18:32:36.621161 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.621094 2577 generic.go:358] "Generic (PLEG): container finished" podID="d42c587a-978d-479c-a50c-6817173ee86b" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff" exitCode=255 Apr 16 18:32:36.621161 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.621122 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" 
event={"ID":"d42c587a-978d-479c-a50c-6817173ee86b","Type":"ContainerDied","Data":"1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff"} Apr 16 18:32:36.621270 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.621179 2577 scope.go:117] "RemoveContainer" containerID="de7717f21459c7e35915397a2c82eebf9af3750de0876edba247f8511e835d36" Apr 16 18:32:36.621443 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:36.621422 2577 scope.go:117] "RemoveContainer" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff" Apr 16 18:32:36.621635 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:36.621614 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-tlltr_openshift-console-operator(d42c587a-978d-479c-a50c-6817173ee86b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b" Apr 16 18:32:37.624353 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:37.624321 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/1.log" Apr 16 18:32:37.624724 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:37.624668 2577 scope.go:117] "RemoveContainer" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff" Apr 16 18:32:37.624841 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:37.624824 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-tlltr_openshift-console-operator(d42c587a-978d-479c-a50c-6817173ee86b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b" 
Apr 16 18:32:39.037600 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:39.037576 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bc429_c8925d01-c41b-4fb9-b191-1828f898b33e/dns-node-resolver/0.log" Apr 16 18:32:39.477792 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:39.477679 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:39.477792 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:39.477774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:39.478036 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:39.477830 2577 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 18:32:39.478036 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:39.477928 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:47.477913217 +0000 UTC m=+147.862233517 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : secret "router-metrics-certs-default" not found Apr 16 18:32:39.478036 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:39.477952 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle podName:eda65526-e0cd-496b-8e27-579365e644c6 nodeName:}" failed. No retries permitted until 2026-04-16 18:32:47.477938329 +0000 UTC m=+147.862258633 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle") pod "router-default-665bf87cd4-2mxqq" (UID: "eda65526-e0cd-496b-8e27-579365e644c6") : configmap references non-existent config key: service-ca.crt Apr 16 18:32:39.578535 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:39.578487 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" Apr 16 18:32:39.578660 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:39.578633 2577 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 18:32:39.578705 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:39.578695 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls podName:3e2da103-e6f7-418f-b965-51e093a27d1a nodeName:}" failed. 
No retries permitted until 2026-04-16 18:32:47.578678994 +0000 UTC m=+147.962999294 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls") pod "cluster-samples-operator-667775844f-4xbn8" (UID: "3e2da103-e6f7-418f-b965-51e093a27d1a") : secret "samples-operator-tls" not found Apr 16 18:32:40.240747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:40.240721 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z98p5_ce5b9cf1-84a5-4e49-965d-fca8343e6704/node-ca/0.log" Apr 16 18:32:41.639497 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:41.639463 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-p9g8c_0f78be17-5c3e-439a-8923-6ef9297713a5/kube-storage-version-migrator-operator/0.log" Apr 16 18:32:42.122919 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:42.122880 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" Apr 16 18:32:42.122919 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:42.122919 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" Apr 16 18:32:42.123270 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:42.123258 2577 scope.go:117] "RemoveContainer" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff" Apr 16 18:32:42.123445 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:42.123429 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator 
pod=console-operator-d87b8d5fc-tlltr_openshift-console-operator(d42c587a-978d-479c-a50c-6817173ee86b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b" Apr 16 18:32:47.544510 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.544473 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:47.544910 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.544563 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:47.545148 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.545126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda65526-e0cd-496b-8e27-579365e644c6-service-ca-bundle\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:47.546819 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.546800 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/eda65526-e0cd-496b-8e27-579365e644c6-metrics-certs\") pod \"router-default-665bf87cd4-2mxqq\" (UID: \"eda65526-e0cd-496b-8e27-579365e644c6\") " pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:47.621787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.621751 2577 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-665bf87cd4-2mxqq" Apr 16 18:32:47.645017 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.644979 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" Apr 16 18:32:47.647583 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.647556 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3e2da103-e6f7-418f-b965-51e093a27d1a-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-4xbn8\" (UID: \"3e2da103-e6f7-418f-b965-51e093a27d1a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" Apr 16 18:32:47.711141 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.711109 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"
Apr 16 18:32:47.744125 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.744078 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-665bf87cd4-2mxqq"]
Apr 16 18:32:47.746959 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:47.746921 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeda65526_e0cd_496b_8e27_579365e644c6.slice/crio-25bd27b84b49ea52e739157058f9fadbd6bdf684466b7cfbcc969b195501f487 WatchSource:0}: Error finding container 25bd27b84b49ea52e739157058f9fadbd6bdf684466b7cfbcc969b195501f487: Status 404 returned error can't find the container with id 25bd27b84b49ea52e739157058f9fadbd6bdf684466b7cfbcc969b195501f487
Apr 16 18:32:47.851540 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:47.851514 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8"]
Apr 16 18:32:48.650685 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:48.650650 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" event={"ID":"3e2da103-e6f7-418f-b965-51e093a27d1a","Type":"ContainerStarted","Data":"c2f28020b2371f96417e45c386b48c1ccc044a11e5e102e1d56e4d39b943437b"}
Apr 16 18:32:48.652073 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:48.652041 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-665bf87cd4-2mxqq" event={"ID":"eda65526-e0cd-496b-8e27-579365e644c6","Type":"ContainerStarted","Data":"c12b7c4d02aae6f78305c0534dc6968161bc696a1c980c664cdb84c0c5b6f3c0"}
Apr 16 18:32:48.652217 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:48.652079 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-665bf87cd4-2mxqq" event={"ID":"eda65526-e0cd-496b-8e27-579365e644c6","Type":"ContainerStarted","Data":"25bd27b84b49ea52e739157058f9fadbd6bdf684466b7cfbcc969b195501f487"}
Apr 16 18:32:48.670622 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:48.670575 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-665bf87cd4-2mxqq" podStartSLOduration=17.670555325 podStartE2EDuration="17.670555325s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:48.669650171 +0000 UTC m=+149.053970493" watchObservedRunningTime="2026-04-16 18:32:48.670555325 +0000 UTC m=+149.054875918"
Apr 16 18:32:49.622914 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.622838 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:49.625229 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.625202 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:49.656082 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.656053 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" event={"ID":"3e2da103-e6f7-418f-b965-51e093a27d1a","Type":"ContainerStarted","Data":"7981c9ce58dd1c583fbe9bd7a1a30f925a00ccd7d620b3c86c350acad93de324"}
Apr 16 18:32:49.656464 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.656086 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" event={"ID":"3e2da103-e6f7-418f-b965-51e093a27d1a","Type":"ContainerStarted","Data":"4d836a4332c7548f0f23e0e509a509e1b84bc986e36cde6afdf3f2e76b9811ab"}
Apr 16 18:32:49.656464 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.656273 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:49.657368 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.657346 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-665bf87cd4-2mxqq"
Apr 16 18:32:49.672546 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:49.672507 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-4xbn8" podStartSLOduration=17.280795726 podStartE2EDuration="18.672495691s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:47.902714061 +0000 UTC m=+148.287034363" lastFinishedPulling="2026-04-16 18:32:49.294414015 +0000 UTC m=+149.678734328" observedRunningTime="2026-04-16 18:32:49.672231696 +0000 UTC m=+150.056552018" watchObservedRunningTime="2026-04-16 18:32:49.672495691 +0000 UTC m=+150.056816012"
Apr 16 18:32:55.208581 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.208542 2577 scope.go:117] "RemoveContainer" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff"
Apr 16 18:32:55.669614 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.669587 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:32:55.669993 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.669976 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/1.log"
Apr 16 18:32:55.670050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.670011 2577 generic.go:358] "Generic (PLEG): container finished" podID="d42c587a-978d-479c-a50c-6817173ee86b" containerID="dc83d524da020b946add1a1d2974584da7be64c40a0c405a730ed39a836a1204" exitCode=255
Apr 16 18:32:55.670099 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.670078 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" event={"ID":"d42c587a-978d-479c-a50c-6817173ee86b","Type":"ContainerDied","Data":"dc83d524da020b946add1a1d2974584da7be64c40a0c405a730ed39a836a1204"}
Apr 16 18:32:55.670141 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.670116 2577 scope.go:117] "RemoveContainer" containerID="1310bc1fa94d354564729c9581a19bf75c3cfbfe7a68cf6c742531f67450feff"
Apr 16 18:32:55.670478 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:55.670458 2577 scope.go:117] "RemoveContainer" containerID="dc83d524da020b946add1a1d2974584da7be64c40a0c405a730ed39a836a1204"
Apr 16 18:32:55.670677 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:55.670651 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-tlltr_openshift-console-operator(d42c587a-978d-479c-a50c-6817173ee86b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b"
Apr 16 18:32:56.673310 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:56.673229 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:32:56.996166 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:56.996066 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-rp9jr" podUID="83f97e63-c527-4a65-9e6d-3bb32971b8d9"
Apr 16 18:32:57.028453 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:57.028421 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-f9b8k" podUID="84c80ef6-0ecb-44ac-9d4b-c6004661b2f5"
Apr 16 18:32:57.223487 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:32:57.223432 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-lvnd8" podUID="aaeaee38-a562-498e-b7ec-505134c92159"
Apr 16 18:32:57.675980 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:57.675951 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rp9jr"
Apr 16 18:32:57.676394 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:57.675951 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9b8k"
Apr 16 18:32:59.342233 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.342206 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hnhgh"]
Apr 16 18:32:59.346817 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.346788 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.349997 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.349979 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 18:32:59.352294 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.352269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-gqtpm\""
Apr 16 18:32:59.352392 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.352322 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 18:32:59.357931 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.357910 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hnhgh"]
Apr 16 18:32:59.408261 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.408230 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"]
Apr 16 18:32:59.411430 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.411411 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.414275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.414231 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 18:32:59.414275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.414247 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 18:32:59.414275 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.414246 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r7rs7\""
Apr 16 18:32:59.419875 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.418324 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 18:32:59.425067 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.425046 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 18:32:59.430815 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.430792 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"]
Apr 16 18:32:59.438785 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.438761 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fedbc8a-b77d-40da-934a-df24d5a83b14-data-volume\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.438899 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.438798 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chqbd\" (UniqueName: \"kubernetes.io/projected/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-api-access-chqbd\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.438899 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.438827 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.439049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.438960 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fedbc8a-b77d-40da-934a-df24d5a83b14-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.439049 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.439009 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fedbc8a-b77d-40da-934a-df24d5a83b14-crio-socket\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/69710d6b-726e-4799-ad71-ecd597010acf-ca-trust-extracted\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539399 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-bound-sa-token\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539425 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-registry-tls\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539454 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fedbc8a-b77d-40da-934a-df24d5a83b14-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539481 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qsf\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-kube-api-access-g2qsf\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539510 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fedbc8a-b77d-40da-934a-df24d5a83b14-crio-socket\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539535 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-trusted-ca\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539576 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-installation-pull-secrets\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539630 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539616 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/3fedbc8a-b77d-40da-934a-df24d5a83b14-crio-socket\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539985 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539652 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-image-registry-private-configuration\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539985 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539719 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fedbc8a-b77d-40da-934a-df24d5a83b14-data-volume\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539985 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539774 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chqbd\" (UniqueName: \"kubernetes.io/projected/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-api-access-chqbd\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.539985 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539801 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-registry-certificates\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.539985 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.539846 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.540579 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.540561 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/3fedbc8a-b77d-40da-934a-df24d5a83b14-data-volume\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.540814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.540799 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.542092 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.542064 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/3fedbc8a-b77d-40da-934a-df24d5a83b14-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.560112 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.560084 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chqbd\" (UniqueName: \"kubernetes.io/projected/3fedbc8a-b77d-40da-934a-df24d5a83b14-kube-api-access-chqbd\") pod \"insights-runtime-extractor-hnhgh\" (UID: \"3fedbc8a-b77d-40da-934a-df24d5a83b14\") " pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.640665 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640578 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-image-registry-private-configuration\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640665 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-registry-certificates\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640704 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/69710d6b-726e-4799-ad71-ecd597010acf-ca-trust-extracted\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640727 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-bound-sa-token\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640752 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-registry-tls\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640780 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qsf\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-kube-api-access-g2qsf\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.640895 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640806 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-trusted-ca\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.641150 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.640906 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-installation-pull-secrets\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.641272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.641246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/69710d6b-726e-4799-ad71-ecd597010acf-ca-trust-extracted\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.641635 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.641608 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-registry-certificates\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.642032 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.642012 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/69710d6b-726e-4799-ad71-ecd597010acf-trusted-ca\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.643122 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.643095 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-image-registry-private-configuration\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.643262 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.643241 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-registry-tls\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.643306 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.643269 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/69710d6b-726e-4799-ad71-ecd597010acf-installation-pull-secrets\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.653756 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.653738 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-bound-sa-token\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.653894 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.653852 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qsf\" (UniqueName: \"kubernetes.io/projected/69710d6b-726e-4799-ad71-ecd597010acf-kube-api-access-g2qsf\") pod \"image-registry-5fc54bfd4f-w9vfc\" (UID: \"69710d6b-726e-4799-ad71-ecd597010acf\") " pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.656652 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.656638 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hnhgh"
Apr 16 18:32:59.722601 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.722571 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:32:59.776719 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.776505 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hnhgh"]
Apr 16 18:32:59.780155 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:59.779925 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fedbc8a_b77d_40da_934a_df24d5a83b14.slice/crio-581f7703d4db2edcc1bfe96c40bd4a3e1b27f8a9d34512c1749887b9ca0bf844 WatchSource:0}: Error finding container 581f7703d4db2edcc1bfe96c40bd4a3e1b27f8a9d34512c1749887b9ca0bf844: Status 404 returned error can't find the container with id 581f7703d4db2edcc1bfe96c40bd4a3e1b27f8a9d34512c1749887b9ca0bf844
Apr 16 18:32:59.845733 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:32:59.845706 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"]
Apr 16 18:32:59.848645 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:32:59.848619 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69710d6b_726e_4799_ad71_ecd597010acf.slice/crio-07de4ed4282eaad9e7d9c7bd25c0263a509d217614bbaea7cadc748f2921a936 WatchSource:0}: Error finding container 07de4ed4282eaad9e7d9c7bd25c0263a509d217614bbaea7cadc748f2921a936: Status 404 returned error can't find the container with id 07de4ed4282eaad9e7d9c7bd25c0263a509d217614bbaea7cadc748f2921a936
Apr 16 18:33:00.684996 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.684903 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hnhgh" event={"ID":"3fedbc8a-b77d-40da-934a-df24d5a83b14","Type":"ContainerStarted","Data":"d6bed338e349f70daf1af1bd46332c2fb97bdb839068f55f9705fd3981c499bb"}
Apr 16 18:33:00.684996 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.684941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hnhgh" event={"ID":"3fedbc8a-b77d-40da-934a-df24d5a83b14","Type":"ContainerStarted","Data":"ad30862e3e38a1ecfe7444a9b6b497a6b41c66c2e447d9c9a08d2d84ef73b738"}
Apr 16 18:33:00.684996 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.684951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hnhgh" event={"ID":"3fedbc8a-b77d-40da-934a-df24d5a83b14","Type":"ContainerStarted","Data":"581f7703d4db2edcc1bfe96c40bd4a3e1b27f8a9d34512c1749887b9ca0bf844"}
Apr 16 18:33:00.686081 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.686056 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc" event={"ID":"69710d6b-726e-4799-ad71-ecd597010acf","Type":"ContainerStarted","Data":"4dfae0c7b3ad7f47cf1ef8ee63b71cdbb936c51657a621bb2e2bc2951f1c8e7e"}
Apr 16 18:33:00.686199 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.686090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc" event={"ID":"69710d6b-726e-4799-ad71-ecd597010acf","Type":"ContainerStarted","Data":"07de4ed4282eaad9e7d9c7bd25c0263a509d217614bbaea7cadc748f2921a936"}
Apr 16 18:33:00.686245 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.686202 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:33:00.705876 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:00.705818 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc" podStartSLOduration=1.705804833 podStartE2EDuration="1.705804833s" podCreationTimestamp="2026-04-16 18:32:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:00.704478094 +0000 UTC m=+161.088798416" watchObservedRunningTime="2026-04-16 18:33:00.705804833 +0000 UTC m=+161.090125192"
Apr 16 18:33:01.960970 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:01.960881 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr"
Apr 16 18:33:01.960970 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:01.960934 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k"
Apr 16 18:33:01.963172 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:01.963149 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/83f97e63-c527-4a65-9e6d-3bb32971b8d9-metrics-tls\") pod \"dns-default-rp9jr\" (UID: \"83f97e63-c527-4a65-9e6d-3bb32971b8d9\") " pod="openshift-dns/dns-default-rp9jr"
Apr 16 18:33:01.963250 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:01.963202 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/84c80ef6-0ecb-44ac-9d4b-c6004661b2f5-cert\") pod \"ingress-canary-f9b8k\" (UID: \"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5\") " pod="openshift-ingress-canary/ingress-canary-f9b8k"
Apr 16 18:33:02.122726 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.122679 2577 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:33:02.122726 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.122731 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr"
Apr 16 18:33:02.123076 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.123063 2577 scope.go:117] "RemoveContainer" containerID="dc83d524da020b946add1a1d2974584da7be64c40a0c405a730ed39a836a1204"
Apr 16 18:33:02.123261 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:02.123246 2577 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-tlltr_openshift-console-operator(d42c587a-978d-479c-a50c-6817173ee86b)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b"
Apr 16 18:33:02.178668 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.178636 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-6c796\""
Apr 16 18:33:02.178854 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.178682 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tdfb9\""
Apr 16 18:33:02.187128 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.187105 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-f9b8k"
Apr 16 18:33:02.187128 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.187124 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-rp9jr"
Apr 16 18:33:02.313750 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.313721 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-f9b8k"]
Apr 16 18:33:02.316975 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:02.316947 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84c80ef6_0ecb_44ac_9d4b_c6004661b2f5.slice/crio-646268d78d0bc74de4ca31fbd1a7e3910d9faeba0b1194ae2fe5460fd9eb4271 WatchSource:0}: Error finding container 646268d78d0bc74de4ca31fbd1a7e3910d9faeba0b1194ae2fe5460fd9eb4271: Status 404 returned error can't find the container with id 646268d78d0bc74de4ca31fbd1a7e3910d9faeba0b1194ae2fe5460fd9eb4271
Apr 16 18:33:02.330225 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.330202 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-rp9jr"]
Apr 16 18:33:02.333048 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:02.333022 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f97e63_c527_4a65_9e6d_3bb32971b8d9.slice/crio-2acc432f68ed80c723a457313b786fd064d821bf2a1e65919f48f8559cdaae55 WatchSource:0}: Error finding container 2acc432f68ed80c723a457313b786fd064d821bf2a1e65919f48f8559cdaae55: Status 404 returned error can't find the container with id 2acc432f68ed80c723a457313b786fd064d821bf2a1e65919f48f8559cdaae55
Apr 16 18:33:02.698532 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.698492 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f9b8k" event={"ID":"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5","Type":"ContainerStarted","Data":"646268d78d0bc74de4ca31fbd1a7e3910d9faeba0b1194ae2fe5460fd9eb4271"}
Apr 16 18:33:02.700519 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.700493 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hnhgh" event={"ID":"3fedbc8a-b77d-40da-934a-df24d5a83b14","Type":"ContainerStarted","Data":"f754ecc634f9b90243e8050739d7de5dae4876848b28ecdf0392c93bffe57f92"}
Apr 16 18:33:02.701742 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.701709 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rp9jr" event={"ID":"83f97e63-c527-4a65-9e6d-3bb32971b8d9","Type":"ContainerStarted","Data":"2acc432f68ed80c723a457313b786fd064d821bf2a1e65919f48f8559cdaae55"}
Apr 16 18:33:02.720372 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:02.720195 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hnhgh" podStartSLOduration=1.903027354 podStartE2EDuration="3.720181436s" podCreationTimestamp="2026-04-16 18:32:59 +0000 UTC" firstStartedPulling="2026-04-16 18:32:59.843026098 +0000 UTC m=+160.227346397" lastFinishedPulling="2026-04-16 18:33:01.660180176 +0000 UTC m=+162.044500479" observedRunningTime="2026-04-16 18:33:02.720045128 +0000 UTC m=+163.104365449" watchObservedRunningTime="2026-04-16 18:33:02.720181436 +0000 UTC m=+163.104501757"
Apr 16 18:33:04.709977 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.709936 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rp9jr" event={"ID":"83f97e63-c527-4a65-9e6d-3bb32971b8d9","Type":"ContainerStarted","Data":"275d9fca8cbaf35d42aeaa22301ef42aef9237bde072bfab49ed9e8b17b3dfe2"}
Apr 16 18:33:04.709977 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.709979 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-rp9jr" event={"ID":"83f97e63-c527-4a65-9e6d-3bb32971b8d9","Type":"ContainerStarted","Data":"587a44f10813e01bf6b7020074b1253516b391d9981d8a0bfb6da8dad23b3d94"}
Apr 16 18:33:04.710477 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.710030 2577 kubelet.go:2658] "SyncLoop (probe)"
probe="readiness" status="not ready" pod="openshift-dns/dns-default-rp9jr" Apr 16 18:33:04.711180 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.711146 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-f9b8k" event={"ID":"84c80ef6-0ecb-44ac-9d4b-c6004661b2f5","Type":"ContainerStarted","Data":"9d59456c33fedcfdc572b99ba1899b2f78735602bc1f60d506f6f8a4016c1c74"} Apr 16 18:33:04.726577 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.726526 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-rp9jr" podStartSLOduration=130.053728149 podStartE2EDuration="2m11.726511239s" podCreationTimestamp="2026-04-16 18:30:53 +0000 UTC" firstStartedPulling="2026-04-16 18:33:02.334754889 +0000 UTC m=+162.719075190" lastFinishedPulling="2026-04-16 18:33:04.007537974 +0000 UTC m=+164.391858280" observedRunningTime="2026-04-16 18:33:04.725977151 +0000 UTC m=+165.110297484" watchObservedRunningTime="2026-04-16 18:33:04.726511239 +0000 UTC m=+165.110831561" Apr 16 18:33:04.742447 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:04.742392 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-f9b8k" podStartSLOduration=130.049310735 podStartE2EDuration="2m11.742380332s" podCreationTimestamp="2026-04-16 18:30:53 +0000 UTC" firstStartedPulling="2026-04-16 18:33:02.31874122 +0000 UTC m=+162.703061520" lastFinishedPulling="2026-04-16 18:33:04.011810815 +0000 UTC m=+164.396131117" observedRunningTime="2026-04-16 18:33:04.741754923 +0000 UTC m=+165.126075243" watchObservedRunningTime="2026-04-16 18:33:04.742380332 +0000 UTC m=+165.126700653" Apr 16 18:33:08.208475 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:08.208402 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8" Apr 16 18:33:14.346697 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.346663 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb"] Apr 16 18:33:14.349821 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.349805 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.353123 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.353101 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:33:14.353250 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.353141 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:33:14.353250 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.353235 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-s8vn8\"" Apr 16 18:33:14.354480 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.354453 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:33:14.354480 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.354479 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:33:14.354795 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.354582 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:33:14.361079 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.361062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb"] Apr 16 18:33:14.399092 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.399065 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2ctl8"] Apr 16 18:33:14.402080 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.402066 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.405947 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.405927 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:33:14.406051 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.405930 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-x6np6\"" Apr 16 18:33:14.406051 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.405956 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:33:14.406223 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.406209 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:33:14.457278 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457253 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.457410 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457283 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-textfile\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457410 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457310 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-metrics-client-ca\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457410 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457375 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-wtmp\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457559 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457559 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457469 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: 
\"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.457559 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457502 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.457559 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457534 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457559 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457558 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5759x\" (UniqueName: \"kubernetes.io/projected/0efab944-8c28-42de-a3d8-4656f60b4334-kube-api-access-5759x\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457606 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-sys\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457664 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-root\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.457787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457696 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx5jl\" (UniqueName: \"kubernetes.io/projected/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-kube-api-access-rx5jl\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.457787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.457727 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-accelerators-collector-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558522 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-sys\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558571 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-root\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:33:14.558593 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rx5jl\" (UniqueName: \"kubernetes.io/projected/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-kube-api-access-rx5jl\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-accelerators-collector-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-root\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558643 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558675 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-textfile\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558639 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-sys\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.558720 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-metrics-client-ca\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:14.558734 2577 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:14.558787 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls podName:fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:15.058771975 +0000 UTC m=+175.443092273 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls") pod "openshift-state-metrics-5669946b84-4hnkb" (UID: "fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2") : secret "openshift-state-metrics-tls" not found Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558733 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-wtmp\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558877 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-wtmp\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558876 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") 
" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558956 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.558996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.559021 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5759x\" (UniqueName: \"kubernetes.io/projected/0efab944-8c28-42de-a3d8-4656f60b4334-kube-api-access-5759x\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559192 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.559138 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-textfile\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559604 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:14.559233 2577 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 18:33:14.559604 ip-10-0-139-49 kubenswrapper[2577]: 
I0416 18:33:14.559296 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-accelerators-collector-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.559604 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:14.559314 2577 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls podName:0efab944-8c28-42de-a3d8-4656f60b4334 nodeName:}" failed. No retries permitted until 2026-04-16 18:33:15.059295061 +0000 UTC m=+175.443615368 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls") pod "node-exporter-2ctl8" (UID: "0efab944-8c28-42de-a3d8-4656f60b4334") : secret "node-exporter-tls" not found Apr 16 18:33:14.559604 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.559351 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0efab944-8c28-42de-a3d8-4656f60b4334-metrics-client-ca\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.560352 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.560330 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.561272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.561250 2577 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.561358 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.561268 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.567357 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.567336 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5759x\" (UniqueName: \"kubernetes.io/projected/0efab944-8c28-42de-a3d8-4656f60b4334-kube-api-access-5759x\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:14.567877 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.567838 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx5jl\" (UniqueName: \"kubernetes.io/projected/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-kube-api-access-rx5jl\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:14.716782 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:14.716700 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-rp9jr" Apr 16 18:33:15.062604 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.062573 2577 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:15.062772 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.062638 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:15.064975 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.064948 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0efab944-8c28-42de-a3d8-4656f60b4334-node-exporter-tls\") pod \"node-exporter-2ctl8\" (UID: \"0efab944-8c28-42de-a3d8-4656f60b4334\") " pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:15.065088 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.065026 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-4hnkb\" (UID: \"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:15.258954 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.258915 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" Apr 16 18:33:15.310880 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.310840 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-2ctl8" Apr 16 18:33:15.322771 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:15.322680 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0efab944_8c28_42de_a3d8_4656f60b4334.slice/crio-c0a4a446da4ae6bf532731f17e3c743aff8dbc0b8be3cf6c38195b671124c0f2 WatchSource:0}: Error finding container c0a4a446da4ae6bf532731f17e3c743aff8dbc0b8be3cf6c38195b671124c0f2: Status 404 returned error can't find the container with id c0a4a446da4ae6bf532731f17e3c743aff8dbc0b8be3cf6c38195b671124c0f2 Apr 16 18:33:15.386455 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.386431 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb"] Apr 16 18:33:15.388787 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:15.388754 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc1f1d4b_ac82_4689_8a2d_f0ddfc64d3a2.slice/crio-604acbb928e85a53b3d6080b8a535401677f61cf81a91ed0e3aa686c25a833c1 WatchSource:0}: Error finding container 604acbb928e85a53b3d6080b8a535401677f61cf81a91ed0e3aa686c25a833c1: Status 404 returned error can't find the container with id 604acbb928e85a53b3d6080b8a535401677f61cf81a91ed0e3aa686c25a833c1 Apr 16 18:33:15.443280 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.443242 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:33:15.446539 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.446520 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.449214 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449185 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 18:33:15.449312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449241 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 18:33:15.449312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449250 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 18:33:15.449312 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449269 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 18:33:15.449672 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449655 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 18:33:15.449763 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449678 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 18:33:15.449763 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449688 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 18:33:15.449996 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.449981 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ckbh2\"" Apr 16 18:33:15.450074 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.450065 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 18:33:15.450398 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.450379 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 18:33:15.462994 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.462972 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:33:15.568347 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568277 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568347 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568311 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4q7v\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568351 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568379 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568423 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568447 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568465 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568505 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568525 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568626 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568695 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568751 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.568822 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.568791 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669217 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669184 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669217 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669222 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669431 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669248 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669431 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669357 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669431 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:33:15.669375 2577 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle podName:80ecd6f8-0bb4-4f55-a4e1-6714f26d293e nodeName:}" failed. No retries permitted until 2026-04-16 18:33:16.169353256 +0000 UTC m=+176.553673557 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e") : configmap references non-existent config key: ca-bundle.crt Apr 16 18:33:15.669431 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669417 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4q7v\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669461 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669490 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669540 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669579 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669612 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.669661 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669641 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.670009 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.669669 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.670009 ip-10-0-139-49 kubenswrapper[2577]: I0416 
18:33:15.669726 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.670133 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.670113 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.670651 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.670629 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.672359 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.672334 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.672463 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.672340 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.672590 
ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.672571 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.672726 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.672700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.673644 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.673606 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.673644 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.673625 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.673789 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.673618 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.673994 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.673972 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.674387 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.674367 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.677524 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.677497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4q7v\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:15.741965 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.741920 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2ctl8" event={"ID":"0efab944-8c28-42de-a3d8-4656f60b4334","Type":"ContainerStarted","Data":"c0a4a446da4ae6bf532731f17e3c743aff8dbc0b8be3cf6c38195b671124c0f2"} Apr 16 18:33:15.743648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.743614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" event={"ID":"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2","Type":"ContainerStarted","Data":"8d921f50b766ce46943456834b531a23cda33bfc36404723042b8d9a108359fe"} Apr 16 18:33:15.743790 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:33:15.743653 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" event={"ID":"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2","Type":"ContainerStarted","Data":"cc9c3f5a966c7cf0e849e5e6113764c1c1348ffc766f0057344125b8ba96ce2a"} Apr 16 18:33:15.743790 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:15.743669 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" event={"ID":"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2","Type":"ContainerStarted","Data":"604acbb928e85a53b3d6080b8a535401677f61cf81a91ed0e3aa686c25a833c1"} Apr 16 18:33:16.173682 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.173657 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:16.174407 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.174383 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:16.209097 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.209072 2577 scope.go:117] "RemoveContainer" containerID="dc83d524da020b946add1a1d2974584da7be64c40a0c405a730ed39a836a1204" Apr 16 18:33:16.356393 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.356366 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:33:16.492983 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.492908 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:33:16.497801 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:16.497773 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80ecd6f8_0bb4_4f55_a4e1_6714f26d293e.slice/crio-2633c9227fd9e60bd73fb6aef8bbde9256c3e0364471f2343c5b52e2bb779e2e WatchSource:0}: Error finding container 2633c9227fd9e60bd73fb6aef8bbde9256c3e0364471f2343c5b52e2bb779e2e: Status 404 returned error can't find the container with id 2633c9227fd9e60bd73fb6aef8bbde9256c3e0364471f2343c5b52e2bb779e2e Apr 16 18:33:16.748045 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.747952 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"2633c9227fd9e60bd73fb6aef8bbde9256c3e0364471f2343c5b52e2bb779e2e"} Apr 16 18:33:16.749501 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.749483 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:33:16.749619 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.749555 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" event={"ID":"d42c587a-978d-479c-a50c-6817173ee86b","Type":"ContainerStarted","Data":"f42fa9a6c7c0b2687c74aa509802a9735db005c609645e30e1eccf6f2fed5744"} Apr 16 18:33:16.749911 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.749885 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" Apr 16 
18:33:16.751052 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.751030 2577 generic.go:358] "Generic (PLEG): container finished" podID="0efab944-8c28-42de-a3d8-4656f60b4334" containerID="42253518d8dcb020d20c07490117557fdd57975402d846d022cc5295e90e0fc8" exitCode=0 Apr 16 18:33:16.751174 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.751090 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2ctl8" event={"ID":"0efab944-8c28-42de-a3d8-4656f60b4334","Type":"ContainerDied","Data":"42253518d8dcb020d20c07490117557fdd57975402d846d022cc5295e90e0fc8"} Apr 16 18:33:16.752973 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.752950 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" event={"ID":"fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2","Type":"ContainerStarted","Data":"1d573e130cde6fb4e1b957ab55f76675717857c03290767e9ed961b080461876"} Apr 16 18:33:16.769575 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.769534 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podStartSLOduration=43.166118472 podStartE2EDuration="45.769523048s" podCreationTimestamp="2026-04-16 18:32:31 +0000 UTC" firstStartedPulling="2026-04-16 18:32:32.237818361 +0000 UTC m=+132.622138660" lastFinishedPulling="2026-04-16 18:32:34.841222935 +0000 UTC m=+135.225543236" observedRunningTime="2026-04-16 18:33:16.768136611 +0000 UTC m=+177.152456944" watchObservedRunningTime="2026-04-16 18:33:16.769523048 +0000 UTC m=+177.153843369" Apr 16 18:33:16.788951 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:16.788906 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-4hnkb" podStartSLOduration=1.907819205 podStartE2EDuration="2.788891841s" podCreationTimestamp="2026-04-16 18:33:14 +0000 UTC" firstStartedPulling="2026-04-16 
18:33:15.518745647 +0000 UTC m=+175.903065948" lastFinishedPulling="2026-04-16 18:33:16.399818282 +0000 UTC m=+176.784138584" observedRunningTime="2026-04-16 18:33:16.787755445 +0000 UTC m=+177.172075764" watchObservedRunningTime="2026-04-16 18:33:16.788891841 +0000 UTC m=+177.173212167" Apr 16 18:33:17.465722 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.465695 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7fd49f5b86-966k5"] Apr 16 18:33:17.470346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.470319 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.472875 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.472844 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:33:17.473044 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.472907 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:33:17.473144 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.473123 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:33:17.473436 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.473359 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-dlql8\"" Apr 16 18:33:17.473436 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.473382 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:33:17.473436 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.473424 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ad04jfnd6tll\"" Apr 16 18:33:17.473640 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.473469 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:33:17.481648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.481628 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd49f5b86-966k5"] Apr 16 18:33:17.585766 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.585738 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-grpc-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.585779 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c844758-a68e-4383-8692-465ff4d4d319-metrics-client-ca\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.585917 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.585973 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.586006 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.586071 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.586098 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.586142 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.586118 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72s6\" (UniqueName: \"kubernetes.io/projected/4c844758-a68e-4383-8692-465ff4d4d319-kube-api-access-f72s6\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687427 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687349 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687427 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687388 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687634 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687429 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687634 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687472 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687634 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687588 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687636 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f72s6\" (UniqueName: \"kubernetes.io/projected/4c844758-a68e-4383-8692-465ff4d4d319-kube-api-access-f72s6\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687671 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-grpc-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.687787 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.687707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c844758-a68e-4383-8692-465ff4d4d319-metrics-client-ca\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " 
pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.688484 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.688457 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c844758-a68e-4383-8692-465ff4d4d319-metrics-client-ca\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.690203 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.690178 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.690601 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.690325 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.690601 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.690428 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.690601 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.690516 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.690877 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.690839 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-grpc-tls\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.691180 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.691160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/4c844758-a68e-4383-8692-465ff4d4d319-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.695026 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.695006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72s6\" (UniqueName: \"kubernetes.io/projected/4c844758-a68e-4383-8692-465ff4d4d319-kube-api-access-f72s6\") pod \"thanos-querier-7fd49f5b86-966k5\" (UID: \"4c844758-a68e-4383-8692-465ff4d4d319\") " pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.750815 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.750775 2577 patch_prober.go:28] interesting pod/console-operator-d87b8d5fc-tlltr container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.133.0.12:8443/readyz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 16 18:33:17.751018 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.750850 2577 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" podUID="d42c587a-978d-479c-a50c-6817173ee86b" containerName="console-operator" probeResult="failure" output="Get \"https://10.133.0.12:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 16 18:33:17.757761 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.757730 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2ctl8" event={"ID":"0efab944-8c28-42de-a3d8-4656f60b4334","Type":"ContainerStarted","Data":"f3f540727df255de0a13357c854ba7e5d5b5e2fa3b692aa8513b8967e6b6af3f"} Apr 16 18:33:17.757905 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.757768 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2ctl8" event={"ID":"0efab944-8c28-42de-a3d8-4656f60b4334","Type":"ContainerStarted","Data":"f0fb91ee012492d6a4c741dea48fab1445bab82f5c2f9b6657c4b5989d2e4b9c"} Apr 16 18:33:17.759025 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.759002 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758" exitCode=0 Apr 16 18:33:17.759143 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.759087 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758"} Apr 16 18:33:17.779815 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.779794 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" Apr 16 18:33:17.790560 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.790515 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2ctl8" podStartSLOduration=3.097724696 podStartE2EDuration="3.790501797s" podCreationTimestamp="2026-04-16 18:33:14 +0000 UTC" firstStartedPulling="2026-04-16 18:33:15.325486834 +0000 UTC m=+175.709807147" lastFinishedPulling="2026-04-16 18:33:16.018263942 +0000 UTC m=+176.402584248" observedRunningTime="2026-04-16 18:33:17.789964538 +0000 UTC m=+178.174284858" watchObservedRunningTime="2026-04-16 18:33:17.790501797 +0000 UTC m=+178.174822119" Apr 16 18:33:17.798223 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.798205 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-tlltr" Apr 16 18:33:17.940100 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.939690 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7fd49f5b86-966k5"] Apr 16 18:33:17.943909 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:17.943882 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c844758_a68e_4383_8692_465ff4d4d319.slice/crio-05290cefedc6582475fd9e973812f7aa4b0fd2b4f3b3aeded2ca90f59a35f18c WatchSource:0}: Error finding container 05290cefedc6582475fd9e973812f7aa4b0fd2b4f3b3aeded2ca90f59a35f18c: Status 404 returned error can't find the container with id 05290cefedc6582475fd9e973812f7aa4b0fd2b4f3b3aeded2ca90f59a35f18c Apr 16 18:33:17.980966 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.980937 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-z4whn"] Apr 16 18:33:17.985346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.985328 2577 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-z4whn" Apr 16 18:33:17.989633 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.989614 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-twf6q\"" Apr 16 18:33:17.990447 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.990426 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 18:33:17.990552 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:17.990434 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 18:33:18.017946 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.017925 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-z4whn"] Apr 16 18:33:18.093659 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.093623 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdksv\" (UniqueName: \"kubernetes.io/projected/584eb437-56ab-4653-bf73-77111eabe484-kube-api-access-xdksv\") pod \"downloads-586b57c7b4-z4whn\" (UID: \"584eb437-56ab-4653-bf73-77111eabe484\") " pod="openshift-console/downloads-586b57c7b4-z4whn" Apr 16 18:33:18.195119 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.195028 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdksv\" (UniqueName: \"kubernetes.io/projected/584eb437-56ab-4653-bf73-77111eabe484-kube-api-access-xdksv\") pod \"downloads-586b57c7b4-z4whn\" (UID: \"584eb437-56ab-4653-bf73-77111eabe484\") " pod="openshift-console/downloads-586b57c7b4-z4whn" Apr 16 18:33:18.203490 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.203454 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdksv\" (UniqueName: 
\"kubernetes.io/projected/584eb437-56ab-4653-bf73-77111eabe484-kube-api-access-xdksv\") pod \"downloads-586b57c7b4-z4whn\" (UID: \"584eb437-56ab-4653-bf73-77111eabe484\") " pod="openshift-console/downloads-586b57c7b4-z4whn" Apr 16 18:33:18.295550 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.295509 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-z4whn" Apr 16 18:33:18.432109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.431739 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-z4whn"] Apr 16 18:33:18.434714 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:18.434683 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod584eb437_56ab_4653_bf73_77111eabe484.slice/crio-d701599a19563e9684476bb0402b85c7f84d546ed8b51d7e4d42355091adb638 WatchSource:0}: Error finding container d701599a19563e9684476bb0402b85c7f84d546ed8b51d7e4d42355091adb638: Status 404 returned error can't find the container with id d701599a19563e9684476bb0402b85c7f84d546ed8b51d7e4d42355091adb638 Apr 16 18:33:18.764629 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.764589 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"05290cefedc6582475fd9e973812f7aa4b0fd2b4f3b3aeded2ca90f59a35f18c"} Apr 16 18:33:18.766518 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:18.766478 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-z4whn" event={"ID":"584eb437-56ab-4653-bf73-77111eabe484","Type":"ContainerStarted","Data":"d701599a19563e9684476bb0402b85c7f84d546ed8b51d7e4d42355091adb638"} Apr 16 18:33:19.728712 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:19.728673 2577 patch_prober.go:28] interesting 
pod/image-registry-5fc54bfd4f-w9vfc container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 18:33:19.728951 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:19.728740 2577 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc" podUID="69710d6b-726e-4799-ad71-ecd597010acf" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:19.774069 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:19.774032 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad"} Apr 16 18:33:19.774549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:19.774080 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde"} Apr 16 18:33:19.774549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:19.774096 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8"} Apr 16 18:33:20.541515 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.541479 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:33:20.548988 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.548959 2577 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.551744 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.551720 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:33:20.551744 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.551735 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:33:20.551975 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.551760 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:33:20.552944 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.552924 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:33:20.552944 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.552935 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:33:20.553105 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553062 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:33:20.553105 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553069 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5cjtgvcnbsp9v\"" Apr 16 18:33:20.553226 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553200 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:33:20.553369 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553307 2577 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dqvpv\"" Apr 16 18:33:20.553638 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553596 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:33:20.553734 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553705 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:33:20.553734 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553726 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:33:20.553840 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.553743 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:33:20.556251 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.556223 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:33:20.558579 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.558513 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:33:20.560542 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.560520 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:33:20.722278 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722242 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722444 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722291 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722444 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722333 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722444 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722358 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722444 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722385 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722444 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722413 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfpx\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx\") pod 
\"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722463 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722524 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722592 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722622 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722660 2577 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722703 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722734 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722759 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722794 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722818 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.722950 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722871 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.723243 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.722987 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.780650 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.780620 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68"} Apr 16 18:33:20.781013 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.780660 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4"} Apr 16 18:33:20.782570 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.782543 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"11c9bcb89b7e4af636bfa6841777fc1aafb638b65ef637ac5b4729de993234b7"} Apr 16 18:33:20.782686 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.782580 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"c8f4bcc87ef0c4214357b19c8d192b90983aaef6a01117a39eaa782c944505e3"} Apr 16 18:33:20.782686 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.782595 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"f3991a0a92ede1707dffdb82814f58ef7a29baf33aa0de1ad731d73a38123b4d"} Apr 16 18:33:20.824234 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824202 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824249 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824277 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824305 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824346 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824339 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824365 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:33:20.824558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824412 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824479 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824514 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824559 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824596 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824619 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824649 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824676 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfpx\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.824753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824746 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.825109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824776 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.825109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.824804 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.825109 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.825015 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.825254 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.825160 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.825254 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.825170 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.827933 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.827357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.827933 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.827589 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.829270 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.829242 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.831605 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.831579 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.833762 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.833703 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.833762 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.833710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfpx\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834126 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834438 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834770 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.834896 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.834842 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.835308 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.835246 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.836330 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.835677 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.836827 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.836784 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:20.862156 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:20.862128 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:21.013658 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.013623 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:33:21.016462 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:33:21.016429 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51371cc3_9f06_48bb_90f9_389b162a65b4.slice/crio-607208b0086deaff88fdb846e2da2361cb5e67e12a27392406589f3497dca1dd WatchSource:0}: Error finding container 607208b0086deaff88fdb846e2da2361cb5e67e12a27392406589f3497dca1dd: Status 404 returned error can't find the container with id 607208b0086deaff88fdb846e2da2361cb5e67e12a27392406589f3497dca1dd
Apr 16 18:33:21.693165 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.693129 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5fc54bfd4f-w9vfc"
Apr 16 18:33:21.793963 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.793925 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"75ac0fc3628eb729631a983b8fbe44764fa3a1925f7c4a3062c3e195e0d316d6"}
Apr 16 18:33:21.794404 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.793971 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"d6e5e3d0faf375765512e09681fa6912f873c546e22a6535a2ae5526772f382d"}
Apr 16 18:33:21.794404 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.793987 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" event={"ID":"4c844758-a68e-4383-8692-465ff4d4d319","Type":"ContainerStarted","Data":"82dd2a993ac25e57a418906c130138fab4f2fbb527cc75c3943ea4a7c5e55b5e"}
Apr 16 18:33:21.794404 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.794095 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5"
Apr 16 18:33:21.797536 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.797502 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerStarted","Data":"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde"}
Apr 16 18:33:21.799127 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.799095 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="8f6aa730fcc5ebf1d3dad4006d9099c683af08afff3dd831f8870038f67f0864" exitCode=0
Apr 16 18:33:21.799259 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.799127 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"8f6aa730fcc5ebf1d3dad4006d9099c683af08afff3dd831f8870038f67f0864"}
Apr 16 18:33:21.799259 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.799159 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"607208b0086deaff88fdb846e2da2361cb5e67e12a27392406589f3497dca1dd"}
Apr 16 18:33:21.814199 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.814156 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5" podStartSLOduration=2.026834401 podStartE2EDuration="4.814140267s" podCreationTimestamp="2026-04-16 18:33:17 +0000 UTC" firstStartedPulling="2026-04-16 18:33:17.946813135 +0000 UTC m=+178.331133433" lastFinishedPulling="2026-04-16 18:33:20.734118986 +0000 UTC m=+181.118439299" observedRunningTime="2026-04-16 18:33:21.81335011 +0000 UTC m=+182.197670430" watchObservedRunningTime="2026-04-16 18:33:21.814140267 +0000 UTC m=+182.198460589"
Apr 16 18:33:21.873955 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:21.873888 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.640872061 podStartE2EDuration="6.873853936s" podCreationTimestamp="2026-04-16 18:33:15 +0000 UTC" firstStartedPulling="2026-04-16 18:33:16.500483955 +0000 UTC m=+176.884804253" lastFinishedPulling="2026-04-16 18:33:20.73346583 +0000 UTC m=+181.117786128" observedRunningTime="2026-04-16 18:33:21.871946778 +0000 UTC m=+182.256267100" watchObservedRunningTime="2026-04-16 18:33:21.873853936 +0000 UTC m=+182.258174258"
Apr 16 18:33:24.812086 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:24.812058 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"b4b5537582e1c06b6eac70578a12ad3732433a9005e7b2ac8fb5ec4d0f58b765"}
Apr 16 18:33:24.812398 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:24.812108 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"09c9d89ec63c443eb9c9682061cd4544c08349d58646cbd8166ceec1c9be7487"}
Apr 16 18:33:25.820332 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.820292 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"418399d27ca05e6592dc07704dc5bde1cff301921892a87984dc24d37c29e6c5"}
Apr 16 18:33:25.820332 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.820338 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"22bfc34d05c2f823b1f9379f826a90ad91696b30690369fb95fe14e7f71d46a9"}
Apr 16 18:33:25.820830 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.820353 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"9ff0c4aaebbd57e393a32eca496f12329bef60ac19baef7b09752d50a825bdca"}
Apr 16 18:33:25.820830 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.820365 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerStarted","Data":"11adeb782d2e78cfc87e0afd26009e5fc7fa3ec052d337622201be08d0a04501"}
Apr 16 18:33:25.847140 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.847048 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.01342186 podStartE2EDuration="5.847028012s" podCreationTimestamp="2026-04-16 18:33:20 +0000 UTC" firstStartedPulling="2026-04-16 18:33:21.80043101 +0000 UTC m=+182.184751308" lastFinishedPulling="2026-04-16 18:33:24.634037158 +0000 UTC m=+185.018357460" observedRunningTime="2026-04-16 18:33:25.845380576 +0000 UTC m=+186.229700899" watchObservedRunningTime="2026-04-16 18:33:25.847028012 +0000 UTC m=+186.231348334"
Apr 16 18:33:25.862998 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:25.862971 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:33:27.809287 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:27.809254 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7fd49f5b86-966k5"
Apr 16 18:33:35.859441 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:35.859355 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-z4whn" event={"ID":"584eb437-56ab-4653-bf73-77111eabe484","Type":"ContainerStarted","Data":"cd7e5d77ab50049d3dc6aef3d00e2d852b9c61dd2dac712650c10a58d71d8b23"}
Apr 16 18:33:35.860037 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:35.860009 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-z4whn"
Apr 16 18:33:35.878238 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:35.878188 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-z4whn" podStartSLOduration=1.736303414 podStartE2EDuration="18.878169384s" podCreationTimestamp="2026-04-16 18:33:17 +0000 UTC" firstStartedPulling="2026-04-16 18:33:18.436942173 +0000 UTC m=+178.821262477" lastFinishedPulling="2026-04-16 18:33:35.578808145 +0000 UTC m=+195.963128447" observedRunningTime="2026-04-16 18:33:35.875776042 +0000 UTC m=+196.260096367" watchObservedRunningTime="2026-04-16 18:33:35.878169384 +0000 UTC m=+196.262489717"
Apr 16 18:33:35.879447 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:33:35.879423 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-z4whn"
Apr 16 18:34:05.951128 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.951096 2577 generic.go:358] "Generic (PLEG): container finished" podID="ed5ef1cb-7b4d-4a71-a177-306862891c7a" containerID="209bb4a4356b14c6a689f1e245dd18cea7ae4399171bb0a7d8fa7ba78842fce3" exitCode=0
Apr 16 18:34:05.951640 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.951178 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" event={"ID":"ed5ef1cb-7b4d-4a71-a177-306862891c7a","Type":"ContainerDied","Data":"209bb4a4356b14c6a689f1e245dd18cea7ae4399171bb0a7d8fa7ba78842fce3"}
Apr 16 18:34:05.951640 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.951596 2577 scope.go:117] "RemoveContainer" containerID="209bb4a4356b14c6a689f1e245dd18cea7ae4399171bb0a7d8fa7ba78842fce3"
Apr 16 18:34:05.952511 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.952494 2577 generic.go:358] "Generic (PLEG): container finished" podID="0f78be17-5c3e-439a-8923-6ef9297713a5" containerID="62b987df5e524734bb1a87893bcee21929a0dae0ecb9b519e2f7ff306880feda" exitCode=0
Apr 16 18:34:05.952616 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.952565 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" event={"ID":"0f78be17-5c3e-439a-8923-6ef9297713a5","Type":"ContainerDied","Data":"62b987df5e524734bb1a87893bcee21929a0dae0ecb9b519e2f7ff306880feda"}
Apr 16 18:34:05.953711 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:05.953689 2577 scope.go:117] "RemoveContainer" containerID="62b987df5e524734bb1a87893bcee21929a0dae0ecb9b519e2f7ff306880feda"
Apr 16 18:34:06.956924 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:06.956886 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-p9g8c" event={"ID":"0f78be17-5c3e-439a-8923-6ef9297713a5","Type":"ContainerStarted","Data":"2c95a443786d4d905191dbbf82d281eedc4a130413911053f5d7ada5af56e67a"}
Apr 16 18:34:06.958578 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:06.958554 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-crf8r" event={"ID":"ed5ef1cb-7b4d-4a71-a177-306862891c7a","Type":"ContainerStarted","Data":"4a1cfe40f59f8e998c1fac9339f816b1cbf281a8273fa5ac633ff39a0278e5b1"}
Apr 16 18:34:07.594761 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:07.594731 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/init-config-reloader/0.log"
Apr 16 18:34:07.794942 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:07.794916 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/alertmanager/0.log"
Apr 16 18:34:07.994649 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:07.994560 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/config-reloader/0.log"
Apr 16 18:34:08.194695 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:08.194665 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/kube-rbac-proxy-web/0.log"
Apr 16 18:34:08.394844 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:08.394807 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/kube-rbac-proxy/0.log"
Apr 16 18:34:08.595401 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:08.595375 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/kube-rbac-proxy-metric/0.log"
Apr 16 18:34:08.794975 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:08.794950 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/prom-label-proxy/0.log"
Apr 16 18:34:10.195709 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:10.195685 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/init-textfile/0.log"
Apr 16 18:34:10.395646 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:10.395595 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/node-exporter/0.log"
Apr 16 18:34:10.595643 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:10.595600 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/kube-rbac-proxy/0.log"
Apr 16 18:34:11.994356 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:11.994273 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/kube-rbac-proxy-main/0.log"
Apr 16 18:34:12.194475 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:12.194445 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/kube-rbac-proxy-self/0.log"
Apr 16 18:34:12.398564 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:12.398539 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/openshift-state-metrics/0.log"
Apr 16 18:34:12.594120 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:12.594093 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/init-config-reloader/0.log"
Apr 16 18:34:12.795837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:12.795791 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/prometheus/0.log"
Apr 16 18:34:12.996320 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:12.996275 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/config-reloader/0.log"
Apr 16 18:34:13.195533 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:13.195418 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/thanos-sidecar/0.log"
Apr 16 18:34:13.394735 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:13.394709 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/kube-rbac-proxy-web/0.log"
Apr 16 18:34:13.595281 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:13.595226 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/kube-rbac-proxy/0.log"
Apr 16 18:34:13.794189 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:13.794159 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_51371cc3-9f06-48bb-90f9-389b162a65b4/kube-rbac-proxy-thanos/0.log"
Apr 16 18:34:14.594174 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:14.594144 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/thanos-query/0.log"
Apr 16 18:34:14.794317 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:14.794282 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-web/0.log"
Apr 16 18:34:14.993877 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:14.993757 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy/0.log"
Apr 16 18:34:15.194234 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:15.194204 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/prom-label-proxy/0.log"
Apr 16 18:34:15.393850 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:15.393823 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-rules/0.log"
Apr 16 18:34:15.595567 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:15.595537 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-metrics/0.log"
Apr 16 18:34:15.994027 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:15.994003 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:34:16.196479 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:16.196457 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/3.log"
Apr 16 18:34:16.803739 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:16.803708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-z4whn_584eb437-56ab-4653-bf73-77111eabe484/download-server/0.log"
Apr 16 18:34:16.994623 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:16.994592 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-665bf87cd4-2mxqq_eda65526-e0cd-496b-8e27-579365e644c6/router/0.log"
Apr 16 18:34:17.194697 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:17.194626 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f9b8k_84c80ef6-0ecb-44ac-9d4b-c6004661b2f5/serve-healthcheck-canary/0.log"
Apr 16 18:34:20.863256 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:20.863225 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:34:20.887357 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:20.887074 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:34:21.028329 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:21.028302 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:34:31.973542 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:31.973491 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:34:31.976003 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:31.975979 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaeaee38-a562-498e-b7ec-505134c92159-metrics-certs\") pod \"network-metrics-daemon-lvnd8\" (UID: \"aaeaee38-a562-498e-b7ec-505134c92159\") " pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:34:32.211733 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:32.211697 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-kxnrw\""
Apr 16 18:34:32.218931 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:32.218906 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lvnd8"
Apr 16 18:34:32.542416 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:32.542394 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lvnd8"]
Apr 16 18:34:32.544982 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:34:32.544954 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaaeaee38_a562_498e_b7ec_505134c92159.slice/crio-4a5d7ea6c6508a34535c7f8afa61085ff0e1338636a9e315acdd99dc1618d02f WatchSource:0}: Error finding container 4a5d7ea6c6508a34535c7f8afa61085ff0e1338636a9e315acdd99dc1618d02f: Status 404 returned error can't find the container with id 4a5d7ea6c6508a34535c7f8afa61085ff0e1338636a9e315acdd99dc1618d02f
Apr 16 18:34:33.048125 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:33.048083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvnd8" event={"ID":"aaeaee38-a562-498e-b7ec-505134c92159","Type":"ContainerStarted","Data":"4a5d7ea6c6508a34535c7f8afa61085ff0e1338636a9e315acdd99dc1618d02f"}
Apr 16 18:34:34.053059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.053025 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvnd8" event={"ID":"aaeaee38-a562-498e-b7ec-505134c92159","Type":"ContainerStarted","Data":"a401991efd78e6ee2f02570d2d16d804fff090a44f70c47409037d8a6f32a95e"}
Apr 16 18:34:34.053059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.053061 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lvnd8" event={"ID":"aaeaee38-a562-498e-b7ec-505134c92159","Type":"ContainerStarted","Data":"147e5a3b3f27a4ca34a514a2445091a93127b62feb9880a1226541766d4d5b1e"}
Apr 16 18:34:34.077256 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.077200 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lvnd8" podStartSLOduration=253.144387693 podStartE2EDuration="4m14.077184013s" podCreationTimestamp="2026-04-16 18:30:20 +0000 UTC" firstStartedPulling="2026-04-16 18:34:32.546938603 +0000 UTC m=+252.931258903" lastFinishedPulling="2026-04-16 18:34:33.479734921 +0000 UTC m=+253.864055223" observedRunningTime="2026-04-16 18:34:34.076194878 +0000 UTC m=+254.460515198" watchObservedRunningTime="2026-04-16 18:34:34.077184013 +0000 UTC m=+254.461504334"
Apr 16 18:34:34.600306 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600270 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:34:34.600736 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600713 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="alertmanager" containerID="cri-o://bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8" gracePeriod=120
Apr 16 18:34:34.600801 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600763 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-metric" containerID="cri-o://a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68" gracePeriod=120
Apr 16 18:34:34.600905 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600795 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-web" containerID="cri-o://15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad" gracePeriod=120
Apr 16 18:34:34.600905 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600807 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="config-reloader" containerID="cri-o://2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde" gracePeriod=120
Apr 16 18:34:34.600905 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600899 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="prom-label-proxy" containerID="cri-o://72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde" gracePeriod=120
Apr 16 18:34:34.601059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:34.600887 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy" containerID="cri-o://551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4" gracePeriod=120
Apr 16 18:34:35.059534 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059505 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde" exitCode=0
Apr 16 18:34:35.059534 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059529 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68" exitCode=0
Apr 16 18:34:35.059534 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059535 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4" exitCode=0
Apr 16 18:34:35.059534 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059540 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde" exitCode=0
Apr 16 18:34:35.059534 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059545 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8" exitCode=0
Apr 16 18:34:35.060059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059575 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde"}
Apr 16 18:34:35.060059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059614 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68"}
Apr 16 18:34:35.060059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059628 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4"}
Apr 16 18:34:35.060059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059644 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde"}
Apr 16 18:34:35.060059 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.059657 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0"
event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8"} Apr 16 18:34:35.851910 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.851889 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:35.911177 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911150 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911330 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911196 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911330 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911226 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911423 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911348 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 
18:34:35.911423 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911408 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911529 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911503 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911592 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911564 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911603 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911600 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:35.911648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911629 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911799 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911663 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911799 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911707 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911799 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911750 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.911799 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911789 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4q7v\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v\") pod \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\" (UID: \"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e\") " Apr 16 18:34:35.912016 
ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.911841 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:35.912108 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.912087 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:35.912164 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.912115 2577 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-alertmanager-main-db\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:35.912164 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.912120 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:35.914000 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.913968 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume" (OuterVolumeSpecName: "config-volume") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.914610 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.914580 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.915462 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.915380 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.915666 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.915623 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.915666 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.915623 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v" (OuterVolumeSpecName: "kube-api-access-r4q7v") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). 
InnerVolumeSpecName "kube-api-access-r4q7v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:35.916167 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.916101 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:35.916167 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.916125 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.916708 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.916685 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out" (OuterVolumeSpecName: "config-out") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:35.920041 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.920017 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:35.926825 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:35.926804 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config" (OuterVolumeSpecName: "web-config") pod "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" (UID: "80ecd6f8-0bb4-4f55-a4e1-6714f26d293e"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:36.012648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012611 2577 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-cluster-tls-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012643 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-web-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012659 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012674 2577 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-metrics-client-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012687 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-tls-assets\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012698 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-out\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012709 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4q7v\" (UniqueName: \"kubernetes.io/projected/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-kube-api-access-r4q7v\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012722 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-main-tls\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012736 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012749 2577 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.012898 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.012762 2577 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e-config-volume\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:36.066282 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.066253 2577 generic.go:358] "Generic (PLEG): container finished" podID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerID="15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad" exitCode=0 Apr 16 18:34:36.066679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.066300 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad"} Apr 16 18:34:36.066679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.066333 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"80ecd6f8-0bb4-4f55-a4e1-6714f26d293e","Type":"ContainerDied","Data":"2633c9227fd9e60bd73fb6aef8bbde9256c3e0364471f2343c5b52e2bb779e2e"} Apr 16 18:34:36.066679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.066353 2577 scope.go:117] "RemoveContainer" containerID="72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde" Apr 16 18:34:36.066679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.066362 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 18:34:36.073809 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.073748 2577 scope.go:117] "RemoveContainer" containerID="a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68" Apr 16 18:34:36.081310 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.081289 2577 scope.go:117] "RemoveContainer" containerID="551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4" Apr 16 18:34:36.090164 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.090136 2577 scope.go:117] "RemoveContainer" containerID="15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad" Apr 16 18:34:36.090469 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.090451 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:36.094047 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.094026 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:36.097558 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.097538 2577 scope.go:117] "RemoveContainer" containerID="2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde" Apr 16 18:34:36.103850 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.103834 2577 scope.go:117] "RemoveContainer" containerID="bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8" Apr 16 18:34:36.110404 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.110389 2577 scope.go:117] "RemoveContainer" containerID="1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758" Apr 16 18:34:36.116499 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.116484 2577 scope.go:117] "RemoveContainer" containerID="72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde" Apr 16 18:34:36.116925 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.116766 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde\": container with ID starting with 72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde not found: ID does not exist" containerID="72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde" Apr 16 18:34:36.116925 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.116802 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde"} err="failed to get container status \"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde\": rpc error: code = NotFound desc = could not find container \"72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde\": container with ID starting with 72393f7af0e0cc98e2b10befe49239c02354c79de902fa5a7f3209461a646bde not found: ID does not exist" Apr 16 18:34:36.116925 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.116843 2577 scope.go:117] "RemoveContainer" containerID="a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68" Apr 16 18:34:36.117435 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.117325 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68\": container with ID starting with a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68 not found: ID does not exist" containerID="a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68" Apr 16 18:34:36.117435 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.117353 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68"} err="failed to get container status \"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68\": rpc error: code = NotFound desc = could not find 
container \"a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68\": container with ID starting with a5e6ca977229571a5d13be0b869f0ba2306ce36641f003d0cfa984d935ba6e68 not found: ID does not exist" Apr 16 18:34:36.117435 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.117375 2577 scope.go:117] "RemoveContainer" containerID="551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4" Apr 16 18:34:36.117919 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.117801 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4\": container with ID starting with 551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4 not found: ID does not exist" containerID="551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4" Apr 16 18:34:36.117919 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.117826 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4"} err="failed to get container status \"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4\": rpc error: code = NotFound desc = could not find container \"551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4\": container with ID starting with 551da77f38f23b7c738fa57b1881c32a976e52c53204661144af3570ccd31fe4 not found: ID does not exist" Apr 16 18:34:36.117919 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.117846 2577 scope.go:117] "RemoveContainer" containerID="15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad" Apr 16 18:34:36.118423 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.118399 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad\": container with ID starting 
with 15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad not found: ID does not exist" containerID="15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad" Apr 16 18:34:36.118529 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.118451 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad"} err="failed to get container status \"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad\": rpc error: code = NotFound desc = could not find container \"15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad\": container with ID starting with 15fb4ed36babcb8d57484f5e6984e25f23a750092c8f02b7772c26b9925f3bad not found: ID does not exist" Apr 16 18:34:36.118529 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.118475 2577 scope.go:117] "RemoveContainer" containerID="2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde" Apr 16 18:34:36.118791 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.118765 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde\": container with ID starting with 2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde not found: ID does not exist" containerID="2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde" Apr 16 18:34:36.118904 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.118798 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde"} err="failed to get container status \"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde\": rpc error: code = NotFound desc = could not find container \"2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde\": container with ID starting with 
2e865d1a22209422daf4fba8b1d66b645af4543823d0ba5eded43aac324f5fde not found: ID does not exist" Apr 16 18:34:36.118904 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.118820 2577 scope.go:117] "RemoveContainer" containerID="bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8" Apr 16 18:34:36.119167 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.119147 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8\": container with ID starting with bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8 not found: ID does not exist" containerID="bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8" Apr 16 18:34:36.119238 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119173 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8"} err="failed to get container status \"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8\": rpc error: code = NotFound desc = could not find container \"bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8\": container with ID starting with bd3830dbd534bc8ee7d7e3fd80b2301ef1f42f30540a3f37077be28ff6ee5bb8 not found: ID does not exist" Apr 16 18:34:36.119238 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119192 2577 scope.go:117] "RemoveContainer" containerID="1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758" Apr 16 18:34:36.119342 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119262 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 18:34:36.119437 ip-10-0-139-49 kubenswrapper[2577]: E0416 18:34:36.119417 2577 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758\": container with ID starting with 1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758 not found: ID does not exist" containerID="1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758"
Apr 16 18:34:36.119491 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119442 2577 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758"} err="failed to get container status \"1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758\": rpc error: code = NotFound desc = could not find container \"1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758\": container with ID starting with 1373b6c17f2bd43e05ec0865bc6d651a5128e7bca9171712169d7ecf7cb57758 not found: ID does not exist"
Apr 16 18:34:36.119646 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119630 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="config-reloader"
Apr 16 18:34:36.119698 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119649 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="config-reloader"
Apr 16 18:34:36.119698 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119661 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-metric"
Apr 16 18:34:36.119698 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119670 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-metric"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119697 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-web"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119707 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-web"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119723 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119732 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119750 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="alertmanager"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119757 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="alertmanager"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119771 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="prom-label-proxy"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119780 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="prom-label-proxy"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119791 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="init-config-reloader"
Apr 16 18:34:36.119837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119799 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="init-config-reloader"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119882 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-web"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119895 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119907 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="kube-rbac-proxy-metric"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119917 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="prom-label-proxy"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119929 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="config-reloader"
Apr 16 18:34:36.120458 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.119939 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" containerName="alertmanager"
Apr 16 18:34:36.125003 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.124986 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.129417 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.129393 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 16 18:34:36.129689 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.129670 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 16 18:34:36.129777 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.129741 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 16 18:34:36.130047 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.130027 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 16 18:34:36.130132 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.130097 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-ckbh2\""
Apr 16 18:34:36.130190 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.130167 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 16 18:34:36.130244 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.129674 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 16 18:34:36.130244 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.130038 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 16 18:34:36.130633 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.130612 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 16 18:34:36.133845 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.133827 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 16 18:34:36.137053 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.137031 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:34:36.213055 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.212978 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ecd6f8-0bb4-4f55-a4e1-6714f26d293e" path="/var/lib/kubelet/pods/80ecd6f8-0bb4-4f55-a4e1-6714f26d293e/volumes"
Apr 16 18:34:36.214190 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214170 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214245 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214209 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214306 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214244 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214369 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214349 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214420 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214400 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214468 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214434 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214468 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214460 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-web-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214491 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214580 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214549 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214581 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmsj\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-kube-api-access-pmmsj\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214619 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214645 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-out\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.214679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.214671 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-volume\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315119 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315085 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315128 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315187 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315272 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315211 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315437 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315414 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315490 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315450 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315490 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315467 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-web-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315737 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315707 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315788 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315899 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315842 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmsj\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-kube-api-access-pmmsj\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.315951 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315929 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.316007 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315963 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.316007 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.315977 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-out\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.316152 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.316019 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-volume\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.316382 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.316357 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.317447 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.317418 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbf2485b-0a3f-4640-a088-7c0abf43b616-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.318316 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.318273 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.318316 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.318290 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-web-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.318585 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.318562 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.319313 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.319040 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-tls-assets\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.319313 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.319264 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-volume\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.319733 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.319710 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.319880 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.319844 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.320205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.320185 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbf2485b-0a3f-4640-a088-7c0abf43b616-config-out\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.320536 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.320520 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/fbf2485b-0a3f-4640-a088-7c0abf43b616-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.325189 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.325157 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmsj\" (UniqueName: \"kubernetes.io/projected/fbf2485b-0a3f-4640-a088-7c0abf43b616-kube-api-access-pmmsj\") pod \"alertmanager-main-0\" (UID: \"fbf2485b-0a3f-4640-a088-7c0abf43b616\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.465672 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.465591 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 18:34:36.797588 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:36.797560 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 18:34:36.798239 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:34:36.798214 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbf2485b_0a3f_4640_a088_7c0abf43b616.slice/crio-c573231ba9f42cc3dcad3a90b29c4115c32d35923be7e105a88058ae85d6600a WatchSource:0}: Error finding container c573231ba9f42cc3dcad3a90b29c4115c32d35923be7e105a88058ae85d6600a: Status 404 returned error can't find the container with id c573231ba9f42cc3dcad3a90b29c4115c32d35923be7e105a88058ae85d6600a
Apr 16 18:34:37.070465 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:37.070377 2577 generic.go:358] "Generic (PLEG): container finished" podID="fbf2485b-0a3f-4640-a088-7c0abf43b616" containerID="36d51c78a4e9cc82325d0bf792ec90ff0b284e50fd5e7286fb15177935944935" exitCode=0
Apr 16 18:34:37.070902 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:37.070468 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerDied","Data":"36d51c78a4e9cc82325d0bf792ec90ff0b284e50fd5e7286fb15177935944935"}
Apr 16 18:34:37.070902 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:37.070517 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"c573231ba9f42cc3dcad3a90b29c4115c32d35923be7e105a88058ae85d6600a"}
Apr 16 18:34:38.076699 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076670 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"bc6cbcaf632a155870d595d3f8d2e9254c88913853cd6e7a2321e38dd7a5cd63"}
Apr 16 18:34:38.076699 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076703 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"80b9166e49867934ab7e43348c97c0f505587fb2244f0ad0c5a1f35d920de4b3"}
Apr 16 18:34:38.077083 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076713 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"a9cd545659745249fc80d3472f60f42d5de033a1aa27236ea91f6f4558df07fa"}
Apr 16 18:34:38.077083 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076723 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"aaae68a741e9c752d18a3ae0500a059a8c9ac53621cadeac7b0e189f737c865f"}
Apr 16 18:34:38.077083 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076733 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"9691ea9ddbda61c69d10df0b5e53774ef3f30dd29ca389a7372b2d3746ef9e1e"}
Apr 16 18:34:38.077083 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.076741 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"fbf2485b-0a3f-4640-a088-7c0abf43b616","Type":"ContainerStarted","Data":"0c0815e8c63b1c10b11bc9d5b24c6af3f4c8056d19070987bd3c6e321f7362ef"}
Apr 16 18:34:38.105161 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.105120 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.105105857 podStartE2EDuration="2.105105857s" podCreationTimestamp="2026-04-16 18:34:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:38.102329895 +0000 UTC m=+258.486650228" watchObservedRunningTime="2026-04-16 18:34:38.105105857 +0000 UTC m=+258.489426178"
Apr 16 18:34:38.912938 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.912905 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 18:34:38.913375 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913337 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="prometheus" containerID="cri-o://09c9d89ec63c443eb9c9682061cd4544c08349d58646cbd8166ceec1c9be7487" gracePeriod=600
Apr 16 18:34:38.913484 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913343 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy" containerID="cri-o://22bfc34d05c2f823b1f9379f826a90ad91696b30690369fb95fe14e7f71d46a9" gracePeriod=600
Apr 16 18:34:38.913484 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913401 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-thanos" containerID="cri-o://418399d27ca05e6592dc07704dc5bde1cff301921892a87984dc24d37c29e6c5" gracePeriod=600
Apr 16 18:34:38.913484 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913445 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="config-reloader" containerID="cri-o://b4b5537582e1c06b6eac70578a12ad3732433a9005e7b2ac8fb5ec4d0f58b765" gracePeriod=600
Apr 16 18:34:38.913620 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913393 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-web" containerID="cri-o://9ff0c4aaebbd57e393a32eca496f12329bef60ac19baef7b09752d50a825bdca" gracePeriod=600
Apr 16 18:34:38.913620 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:38.913393 2577 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="thanos-sidecar" containerID="cri-o://11adeb782d2e78cfc87e0afd26009e5fc7fa3ec052d337622201be08d0a04501" gracePeriod=600
Apr 16 18:34:39.084441 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084416 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="418399d27ca05e6592dc07704dc5bde1cff301921892a87984dc24d37c29e6c5" exitCode=0
Apr 16 18:34:39.084441 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084438 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="22bfc34d05c2f823b1f9379f826a90ad91696b30690369fb95fe14e7f71d46a9" exitCode=0
Apr 16 18:34:39.084441 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084444 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="11adeb782d2e78cfc87e0afd26009e5fc7fa3ec052d337622201be08d0a04501" exitCode=0
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084450 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="b4b5537582e1c06b6eac70578a12ad3732433a9005e7b2ac8fb5ec4d0f58b765" exitCode=0
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084457 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="09c9d89ec63c443eb9c9682061cd4544c08349d58646cbd8166ceec1c9be7487" exitCode=0
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084482 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"418399d27ca05e6592dc07704dc5bde1cff301921892a87984dc24d37c29e6c5"}
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084511 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"22bfc34d05c2f823b1f9379f826a90ad91696b30690369fb95fe14e7f71d46a9"}
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084521 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"11adeb782d2e78cfc87e0afd26009e5fc7fa3ec052d337622201be08d0a04501"}
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084532 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"b4b5537582e1c06b6eac70578a12ad3732433a9005e7b2ac8fb5ec4d0f58b765"}
Apr 16 18:34:39.084784 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:39.084540 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"09c9d89ec63c443eb9c9682061cd4544c08349d58646cbd8166ceec1c9be7487"}
Apr 16 18:34:40.091003 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.090975 2577 generic.go:358] "Generic (PLEG): container finished" podID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerID="9ff0c4aaebbd57e393a32eca496f12329bef60ac19baef7b09752d50a825bdca" exitCode=0
Apr 16 18:34:40.091323 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.091046 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"9ff0c4aaebbd57e393a32eca496f12329bef60ac19baef7b09752d50a825bdca"}
Apr 16 18:34:40.152241 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.152218 2577 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 18:34:40.249521 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249453 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249521 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249494 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249521 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249511 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249536 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249649 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249700 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249729 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") "
Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249774 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfpx\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx\") pod
\"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.249814 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249808 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249835 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249907 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249943 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249962 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). 
InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249959 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.249969 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250030 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250064 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250231 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250110 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250235 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250278 2577 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db\") pod \"51371cc3-9f06-48bb-90f9-389b162a65b4\" (UID: \"51371cc3-9f06-48bb-90f9-389b162a65b4\") " Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250315 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250554 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250575 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.250691 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.250591 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.251618 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.251595 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:40.251736 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.251671 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). 
InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:40.251736 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.251692 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:34:40.253344 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.253317 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config" (OuterVolumeSpecName: "config") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254153 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254118 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254259 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254154 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out" (OuterVolumeSpecName: "config-out") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:34:40.254324 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254288 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254424 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx" (OuterVolumeSpecName: "kube-api-access-6zfpx") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "kube-api-access-6zfpx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:40.254591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254438 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254455 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:34:40.254591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254472 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254567 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.254909 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.254824 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.255111 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.255092 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.264783 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.264758 2577 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config" (OuterVolumeSpecName: "web-config") pod "51371cc3-9f06-48bb-90f9-389b162a65b4" (UID: "51371cc3-9f06-48bb-90f9-389b162a65b4"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:34:40.351574 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351546 2577 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6zfpx\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-kube-api-access-6zfpx\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351574 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351572 2577 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/51371cc3-9f06-48bb-90f9-389b162a65b4-tls-assets\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351582 2577 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351592 2577 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-grpc-tls\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351602 2577 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-web-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351610 2577 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-metrics-client-certs\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351619 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351630 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351639 2577 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-configmap-metrics-client-ca\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351648 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-db\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351656 2577 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/51371cc3-9f06-48bb-90f9-389b162a65b4-config-out\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351665 2577 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-config\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351673 2577 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351681 2577 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/51371cc3-9f06-48bb-90f9-389b162a65b4-secret-kube-rbac-proxy\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:40.351747 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:40.351689 2577 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/51371cc3-9f06-48bb-90f9-389b162a65b4-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-49.ec2.internal\" DevicePath \"\"" Apr 16 18:34:41.096355 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.096325 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"51371cc3-9f06-48bb-90f9-389b162a65b4","Type":"ContainerDied","Data":"607208b0086deaff88fdb846e2da2361cb5e67e12a27392406589f3497dca1dd"} Apr 16 18:34:41.096756 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.096357 2577 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.096756 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.096369 2577 scope.go:117] "RemoveContainer" containerID="418399d27ca05e6592dc07704dc5bde1cff301921892a87984dc24d37c29e6c5" Apr 16 18:34:41.104954 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.104937 2577 scope.go:117] "RemoveContainer" containerID="22bfc34d05c2f823b1f9379f826a90ad91696b30690369fb95fe14e7f71d46a9" Apr 16 18:34:41.111531 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.111500 2577 scope.go:117] "RemoveContainer" containerID="9ff0c4aaebbd57e393a32eca496f12329bef60ac19baef7b09752d50a825bdca" Apr 16 18:34:41.117753 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.117733 2577 scope.go:117] "RemoveContainer" containerID="11adeb782d2e78cfc87e0afd26009e5fc7fa3ec052d337622201be08d0a04501" Apr 16 18:34:41.120257 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.120234 2577 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:41.124173 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.124151 2577 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:41.125311 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.125289 2577 scope.go:117] "RemoveContainer" containerID="b4b5537582e1c06b6eac70578a12ad3732433a9005e7b2ac8fb5ec4d0f58b765" Apr 16 18:34:41.131728 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.131713 2577 scope.go:117] "RemoveContainer" containerID="09c9d89ec63c443eb9c9682061cd4544c08349d58646cbd8166ceec1c9be7487" Apr 16 18:34:41.138591 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.138571 2577 scope.go:117] "RemoveContainer" containerID="8f6aa730fcc5ebf1d3dad4006d9099c683af08afff3dd831f8870038f67f0864" Apr 16 18:34:41.149057 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149037 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 
18:34:41.149335 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149324 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="prometheus" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149337 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="prometheus" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149344 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-web" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149350 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-web" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149357 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149362 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149370 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="init-config-reloader" Apr 16 18:34:41.149374 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149375 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="init-config-reloader" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149383 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" 
containerName="thanos-sidecar" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149388 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="thanos-sidecar" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149394 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="config-reloader" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149399 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="config-reloader" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149417 2577 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-thanos" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149422 2577 state_mem.go:107] "Deleted CPUSet assignment" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-thanos" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149466 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="prometheus" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149475 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="config-reloader" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149482 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149489 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" 
containerName="kube-rbac-proxy-thanos" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149497 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="thanos-sidecar" Apr 16 18:34:41.149598 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.149503 2577 memory_manager.go:356] "RemoveStaleState removing state" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" containerName="kube-rbac-proxy-web" Apr 16 18:34:41.154452 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.154438 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.157463 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157437 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:34:41.157463 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157451 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:34:41.157463 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157454 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:34:41.157667 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157455 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-dqvpv\"" Apr 16 18:34:41.157667 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157582 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5cjtgvcnbsp9v\"" Apr 16 18:34:41.157824 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157811 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 
18:34:41.157888 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157848 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:34:41.157995 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157980 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:34:41.157995 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.157993 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:34:41.158134 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.158094 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:34:41.158356 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.158342 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:34:41.158398 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.158348 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:34:41.158839 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.158823 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:34:41.166642 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.166318 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:34:41.169701 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.169682 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:41.169842 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.169823 
2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:34:41.257717 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257690 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257825 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257723 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257825 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257742 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257825 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257765 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257825 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257806 2577 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257826 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257843 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257888 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwq8v\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-kube-api-access-gwq8v\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257909 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257923 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257941 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-web-config\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257958 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.257978 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.257972 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.258205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.258024 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.258205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.258075 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.258205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.258090 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.258205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.258108 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.258205 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.258132 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358679 
ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358611 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358652 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358679 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358670 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358720 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358747 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-trusted-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358770 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.358837 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358817 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwq8v\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-kube-api-access-gwq8v\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358851 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358912 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358941 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-web-config\") pod \"prometheus-k8s-0\" 
(UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358973 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.358996 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359050 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359022 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359061 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359087 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-kube-rbac-proxy\") 
pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359118 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359176 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.359400 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359233 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.360549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359728 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.360549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359796 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.360549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.359841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.360549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.360125 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.360549 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.360253 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.361847 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.361407 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config-out\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.361847 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.361669 2577 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.362012 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.361958 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.362070 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.362047 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.362783 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.362719 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.363183 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.363161 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.363518 ip-10-0-139-49 kubenswrapper[2577]: I0416 
18:34:41.363497 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.363741 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.363717 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-config\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.364014 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.363995 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.364014 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.364006 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.364428 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.364413 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2da4cca6-8efa-4ff8-8739-2be46c0433fd-web-config\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.365151 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.365132 2577 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/2da4cca6-8efa-4ff8-8739-2be46c0433fd-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.366791 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.366775 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwq8v\" (UniqueName: \"kubernetes.io/projected/2da4cca6-8efa-4ff8-8739-2be46c0433fd-kube-api-access-gwq8v\") pod \"prometheus-k8s-0\" (UID: \"2da4cca6-8efa-4ff8-8739-2be46c0433fd\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.468550 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.468501 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:34:41.594541 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:41.594493 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:34:41.596647 ip-10-0-139-49 kubenswrapper[2577]: W0416 18:34:41.596619 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2da4cca6_8efa_4ff8_8739_2be46c0433fd.slice/crio-3f82d3e034d8a0c5501af6a262a745860c521136ab73a36a73658a8be92c6480 WatchSource:0}: Error finding container 3f82d3e034d8a0c5501af6a262a745860c521136ab73a36a73658a8be92c6480: Status 404 returned error can't find the container with id 3f82d3e034d8a0c5501af6a262a745860c521136ab73a36a73658a8be92c6480 Apr 16 18:34:42.101662 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:42.101632 2577 generic.go:358] "Generic (PLEG): container finished" podID="2da4cca6-8efa-4ff8-8739-2be46c0433fd" containerID="d916f491206bf5b58afd2bdbf884f28b37c026aa17ce9f1094342a17538ffaf6" exitCode=0 Apr 16 18:34:42.102036 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:42.101697 2577 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerDied","Data":"d916f491206bf5b58afd2bdbf884f28b37c026aa17ce9f1094342a17538ffaf6"} Apr 16 18:34:42.102036 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:42.101716 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"3f82d3e034d8a0c5501af6a262a745860c521136ab73a36a73658a8be92c6480"} Apr 16 18:34:42.213361 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:42.213330 2577 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51371cc3-9f06-48bb-90f9-389b162a65b4" path="/var/lib/kubelet/pods/51371cc3-9f06-48bb-90f9-389b162a65b4/volumes" Apr 16 18:34:43.108937 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108904 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"12350d3a42525f80f44326b9119355a6b57673985d5f80e4d03caa6bed4a7f1c"} Apr 16 18:34:43.108937 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108941 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"ed04354706ce8350e5f9fe952fe7ca4bda7ed274bc681bb0169e7580b99eff77"} Apr 16 18:34:43.109337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108951 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"06123ba8c51e785eeded77ba7e2b473f2c8a911b70b835559b74fc8242313447"} Apr 16 18:34:43.109337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108960 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"cd4dea420d36a21b176ed53549247e823e4240fd874ce14d63c33d917a71d83e"} Apr 16 18:34:43.109337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108970 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"1c1513ef3206ae2ea827480e63da20de1532d029bc1d095ecc4856f9005a3dff"} Apr 16 18:34:43.109337 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.108978 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"2da4cca6-8efa-4ff8-8739-2be46c0433fd","Type":"ContainerStarted","Data":"100f14637d6cb67408c9151d948c2ae48bce4c07519e556d3c68cd465ee70465"} Apr 16 18:34:43.136762 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:43.136708 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.136687955 podStartE2EDuration="2.136687955s" podCreationTimestamp="2026-04-16 18:34:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:34:43.134583074 +0000 UTC m=+263.518903412" watchObservedRunningTime="2026-04-16 18:34:43.136687955 +0000 UTC m=+263.521008279" Apr 16 18:34:46.468845 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:34:46.468806 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:35:20.093449 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:20.093422 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:35:20.093953 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:20.093425 2577 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:35:20.099730 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:20.099708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:35:20.100057 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:20.100037 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:35:20.104147 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:20.104133 2577 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:35:41.468796 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:41.468758 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:35:41.484683 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:41.484660 2577 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:35:42.300304 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:35:42.300277 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:40:20.117660 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:40:20.117629 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:40:20.118550 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:40:20.118525 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:40:20.123025 ip-10-0-139-49 
kubenswrapper[2577]: I0416 18:40:20.123008 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:40:20.123769 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:40:20.123749 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:45:20.139952 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:45:20.139881 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:45:20.145446 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:45:20.145424 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:45:20.149297 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:45:20.149253 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:45:20.155310 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:45:20.155288 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 18:50:20.165345 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:50:20.165313 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log" Apr 16 18:50:20.170801 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:50:20.170782 2577 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:50:20.174464 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:50:20.174448 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:50:20.179678 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:50:20.179663 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:55:20.188187 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:55:20.188160 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:55:20.196648 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:55:20.196625 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 18:55:20.200771 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:55:20.200751 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 18:55:20.205551 ip-10-0-139-49 kubenswrapper[2577]: I0416 18:55:20.205533 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:00:20.214505 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:00:20.214468 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:00:20.219525 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:00:20.219503 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:00:20.223635 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:00:20.223618 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:00:20.228479 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:00:20.228461 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:05:20.235358 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:05:20.235329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:05:20.240574 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:05:20.240548 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:05:20.248566 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:05:20.248548 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:05:20.253450 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:05:20.253433 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:10:20.260595 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:10:20.260559 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:10:20.266271 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:10:20.266247 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:10:20.270997 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:10:20.270978 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:10:20.275654 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:10:20.275637 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:15:20.284141 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:15:20.284062 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:15:20.290259 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:15:20.290237 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:15:20.294024 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:15:20.294007 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:15:20.298686 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:15:20.298669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:20:20.308144 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:20:20.308043 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:20:20.313693 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:20:20.313671 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:20:20.316450 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:20:20.316431 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:20:20.321323 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:20:20.321306 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:25:20.331177 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:25:20.331047 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:25:20.337027 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:25:20.337006 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:25:20.339771 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:25:20.339753 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:25:20.344414 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:25:20.344399 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:30:20.354052 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:30:20.353934 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:30:20.360169 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:30:20.359588 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:30:20.361889 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:30:20.361850 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:30:20.366723 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:30:20.366708 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:35:19.663461 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.663421 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dncfj/must-gather-vrbcb"]
Apr 16 19:35:19.668334 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.668309 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.671591 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.671565 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dncfj\"/\"kube-root-ca.crt\""
Apr 16 19:35:19.671730 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.671565 2577 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-dncfj\"/\"openshift-service-ca.crt\""
Apr 16 19:35:19.671730 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.671641 2577 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-dncfj\"/\"default-dockercfg-4wqkq\""
Apr 16 19:35:19.674999 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.674977 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/must-gather-vrbcb"]
Apr 16 19:35:19.738280 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.738248 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c332f73-03e6-4c9f-9623-ce056005c6a8-must-gather-output\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.738437 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.738368 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nptt\" (UniqueName: \"kubernetes.io/projected/1c332f73-03e6-4c9f-9623-ce056005c6a8-kube-api-access-7nptt\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.839537 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.839502 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7nptt\" (UniqueName: \"kubernetes.io/projected/1c332f73-03e6-4c9f-9623-ce056005c6a8-kube-api-access-7nptt\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.839709 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.839553 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c332f73-03e6-4c9f-9623-ce056005c6a8-must-gather-output\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.839885 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.839841 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/1c332f73-03e6-4c9f-9623-ce056005c6a8-must-gather-output\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.847729 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.847707 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nptt\" (UniqueName: \"kubernetes.io/projected/1c332f73-03e6-4c9f-9623-ce056005c6a8-kube-api-access-7nptt\") pod \"must-gather-vrbcb\" (UID: \"1c332f73-03e6-4c9f-9623-ce056005c6a8\") " pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:19.987788 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:19.987711 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/must-gather-vrbcb"
Apr 16 19:35:20.106570 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.106546 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/must-gather-vrbcb"]
Apr 16 19:35:20.109023 ip-10-0-139-49 kubenswrapper[2577]: W0416 19:35:20.108997 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c332f73_03e6_4c9f_9623_ce056005c6a8.slice/crio-e9900eb7a76656e0b0d24d3507275721e7598e3ca2cd34bba71d5c8c024910a1 WatchSource:0}: Error finding container e9900eb7a76656e0b0d24d3507275721e7598e3ca2cd34bba71d5c8c024910a1: Status 404 returned error can't find the container with id e9900eb7a76656e0b0d24d3507275721e7598e3ca2cd34bba71d5c8c024910a1
Apr 16 19:35:20.110798 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.110782 2577 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:35:20.376668 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.376563 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:35:20.389696 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.382567 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:35:20.389696 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.384906 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:35:20.390080 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.390064 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log"
Apr 16 19:35:20.720628 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:20.720551 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/must-gather-vrbcb" event={"ID":"1c332f73-03e6-4c9f-9623-ce056005c6a8","Type":"ContainerStarted","Data":"e9900eb7a76656e0b0d24d3507275721e7598e3ca2cd34bba71d5c8c024910a1"}
Apr 16 19:35:21.727129 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:21.727083 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/must-gather-vrbcb" event={"ID":"1c332f73-03e6-4c9f-9623-ce056005c6a8","Type":"ContainerStarted","Data":"c04c27c6108207c277375cada78685e91d9372959a91f79745306beeece8b959"}
Apr 16 19:35:21.727129 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:21.727135 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/must-gather-vrbcb" event={"ID":"1c332f73-03e6-4c9f-9623-ce056005c6a8","Type":"ContainerStarted","Data":"548aa35cadf73c7584724288b69f411df29563b2cb72c4864921a84c71764177"}
Apr 16 19:35:21.743293 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:21.743244 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dncfj/must-gather-vrbcb" podStartSLOduration=2.014611233 podStartE2EDuration="2.743229857s" podCreationTimestamp="2026-04-16 19:35:19 +0000 UTC" firstStartedPulling="2026-04-16 19:35:20.110922872 +0000 UTC m=+3900.495243174" lastFinishedPulling="2026-04-16 19:35:20.839541499 +0000 UTC m=+3901.223861798" observedRunningTime="2026-04-16 19:35:21.742050269 +0000 UTC m=+3902.126370591" watchObservedRunningTime="2026-04-16 19:35:21.743229857 +0000 UTC m=+3902.127550178"
Apr 16 19:35:22.301337 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:22.301309 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-gsxc5_9199c2f9-247b-4dc9-9900-9a6c99aac450/global-pull-secret-syncer/0.log"
Apr 16 19:35:22.352468 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:22.352441 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-7rh4p_419f0068-d7e6-4c74-bfcf-1f6ab607c8bb/konnectivity-agent/0.log"
Apr 16 19:35:22.516819 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:22.516785 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-49.ec2.internal_bb503e0bda4c32012b9b964fc44f2748/haproxy/0.log"
Apr 16 19:35:26.044091 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.044056 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/alertmanager/0.log"
Apr 16 19:35:26.069933 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.069868 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/config-reloader/0.log"
Apr 16 19:35:26.093393 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.093329 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/kube-rbac-proxy-web/0.log"
Apr 16 19:35:26.116198 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.116159 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/kube-rbac-proxy/0.log"
Apr 16 19:35:26.139572 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.139501 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/kube-rbac-proxy-metric/0.log"
Apr 16 19:35:26.165211 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.165186 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/prom-label-proxy/0.log"
Apr 16 19:35:26.189837 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.189799 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_fbf2485b-0a3f-4640-a088-7c0abf43b616/init-config-reloader/0.log"
Apr 16 19:35:26.414992 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.414920 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/node-exporter/0.log"
Apr 16 19:35:26.436321 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.436296 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/kube-rbac-proxy/0.log"
Apr 16 19:35:26.461340 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.461315 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2ctl8_0efab944-8c28-42de-a3d8-4656f60b4334/init-textfile/0.log"
Apr 16 19:35:26.641699 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.641669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/kube-rbac-proxy-main/0.log"
Apr 16 19:35:26.667454 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.667385 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/kube-rbac-proxy-self/0.log"
Apr 16 19:35:26.694323 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.694300 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-4hnkb_fc1f1d4b-ac82-4689-8a2d-f0ddfc64d3a2/openshift-state-metrics/0.log"
Apr 16 19:35:26.743202 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.743173 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/prometheus/0.log"
Apr 16 19:35:26.765064 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.765039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/config-reloader/0.log"
Apr 16 19:35:26.788103 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.788080 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/thanos-sidecar/0.log"
Apr 16 19:35:26.811782 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.811756 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/kube-rbac-proxy-web/0.log"
Apr 16 19:35:26.834755 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.834732 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/kube-rbac-proxy/0.log"
Apr 16 19:35:26.861270 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.861243 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/kube-rbac-proxy-thanos/0.log"
Apr 16 19:35:26.885405 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:26.885381 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_2da4cca6-8efa-4ff8-8739-2be46c0433fd/init-config-reloader/0.log"
Apr 16 19:35:27.128884 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.128840 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/thanos-query/0.log"
Apr 16 19:35:27.156374 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.156346 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-web/0.log"
Apr 16 19:35:27.184758 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.184723 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy/0.log"
Apr 16 19:35:27.211301 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.211272 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/prom-label-proxy/0.log"
Apr 16 19:35:27.246971 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.246908 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-rules/0.log"
Apr 16 19:35:27.286106 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:27.286039 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7fd49f5b86-966k5_4c844758-a68e-4383-8692-465ff4d4d319/kube-rbac-proxy-metrics/0.log"
Apr 16 19:35:28.765550 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:28.765475 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/2.log"
Apr 16 19:35:28.770455 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:28.770430 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-tlltr_d42c587a-978d-479c-a50c-6817173ee86b/console-operator/3.log"
Apr 16 19:35:29.187972 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.187943 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-z4whn_584eb437-56ab-4653-bf73-77111eabe484/download-server/0.log"
Apr 16 19:35:29.487788 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.487671 2577 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"]
Apr 16 19:35:29.493109 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.493079 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.498644 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.498619 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"]
Apr 16 19:35:29.631174 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.631139 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb5r4\" (UniqueName: \"kubernetes.io/projected/5e5f38d7-815e-40f1-bc24-83a43bdaa476-kube-api-access-pb5r4\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.631338 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.631179 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-lib-modules\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.631338 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.631243 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-proc\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.631338 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.631266 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-podres\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.631451 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.631365 2577 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-sys\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.731957 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.731923 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-sys\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732111 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.731982 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb5r4\" (UniqueName: \"kubernetes.io/projected/5e5f38d7-815e-40f1-bc24-83a43bdaa476-kube-api-access-pb5r4\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732111 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732003 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-lib-modules\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732111 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732040 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-proc\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732111 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732039 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-sys\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732248 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732153 2577 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-podres\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732248 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732180 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-proc\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732334 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732322 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-podres\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.732370 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.732356 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5e5f38d7-815e-40f1-bc24-83a43bdaa476-lib-modules\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.740759 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.740700 2577 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb5r4\" (UniqueName: \"kubernetes.io/projected/5e5f38d7-815e-40f1-bc24-83a43bdaa476-kube-api-access-pb5r4\") pod \"perf-node-gather-daemonset-4m7xz\" (UID: \"5e5f38d7-815e-40f1-bc24-83a43bdaa476\") " pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.807321 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.807285 2577 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:29.945083 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:29.945062 2577 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"]
Apr 16 19:35:29.947873 ip-10-0-139-49 kubenswrapper[2577]: W0416 19:35:29.947833 2577 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5e5f38d7_815e_40f1_bc24_83a43bdaa476.slice/crio-f9142bf77ff1f358ecc45765a2f34266db60500cb520bd112a423660dbec3fda WatchSource:0}: Error finding container f9142bf77ff1f358ecc45765a2f34266db60500cb520bd112a423660dbec3fda: Status 404 returned error can't find the container with id f9142bf77ff1f358ecc45765a2f34266db60500cb520bd112a423660dbec3fda
Apr 16 19:35:30.378986 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.378953 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rp9jr_83f97e63-c527-4a65-9e6d-3bb32971b8d9/dns/0.log"
Apr 16 19:35:30.406768 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.406743 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-rp9jr_83f97e63-c527-4a65-9e6d-3bb32971b8d9/kube-rbac-proxy/0.log"
Apr 16 19:35:30.529699 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.529669 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bc429_c8925d01-c41b-4fb9-b191-1828f898b33e/dns-node-resolver/0.log"
Apr 16 19:35:30.761471 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.761395 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz" event={"ID":"5e5f38d7-815e-40f1-bc24-83a43bdaa476","Type":"ContainerStarted","Data":"b80c3dae1b47b0a97e8cc43e03ea07c9a2f4ecf7a0eada3dc47c7c751d4205f2"}
Apr 16 19:35:30.761471 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.761429 2577 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz" event={"ID":"5e5f38d7-815e-40f1-bc24-83a43bdaa476","Type":"ContainerStarted","Data":"f9142bf77ff1f358ecc45765a2f34266db60500cb520bd112a423660dbec3fda"}
Apr 16 19:35:30.761652 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.761549 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:30.783045 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:30.782989 2577 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz" podStartSLOduration=1.78297036 podStartE2EDuration="1.78297036s" podCreationTimestamp="2026-04-16 19:35:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:35:30.779382247 +0000 UTC m=+3911.163702569" watchObservedRunningTime="2026-04-16 19:35:30.78297036 +0000 UTC m=+3911.167290683"
Apr 16 19:35:31.009588 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:31.009554 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5fc54bfd4f-w9vfc_69710d6b-726e-4799-ad71-ecd597010acf/registry/0.log"
Apr 16 19:35:31.106406 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:31.106380 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-z98p5_ce5b9cf1-84a5-4e49-965d-fca8343e6704/node-ca/0.log"
Apr 16 19:35:31.828110 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:31.828079 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-665bf87cd4-2mxqq_eda65526-e0cd-496b-8e27-579365e644c6/router/0.log"
Apr 16 19:35:32.162847 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.162780 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-f9b8k_84c80ef6-0ecb-44ac-9d4b-c6004661b2f5/serve-healthcheck-canary/0.log"
Apr 16 19:35:32.564600 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.564570 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-crf8r_ed5ef1cb-7b4d-4a71-a177-306862891c7a/insights-operator/0.log"
Apr 16 19:35:32.566901 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.566849 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-crf8r_ed5ef1cb-7b4d-4a71-a177-306862891c7a/insights-operator/1.log"
Apr 16 19:35:32.665614 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.665585 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hnhgh_3fedbc8a-b77d-40da-934a-df24d5a83b14/kube-rbac-proxy/0.log"
Apr 16 19:35:32.690233 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.690209 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hnhgh_3fedbc8a-b77d-40da-934a-df24d5a83b14/exporter/0.log"
Apr 16 19:35:32.712756 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:32.712730 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hnhgh_3fedbc8a-b77d-40da-934a-df24d5a83b14/extractor/0.log"
Apr 16 19:35:36.775136 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:36.775105 2577 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-dncfj/perf-node-gather-daemonset-4m7xz"
Apr 16 19:35:39.691385 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:39.691356 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-p9g8c_0f78be17-5c3e-439a-8923-6ef9297713a5/kube-storage-version-migrator-operator/1.log"
Apr 16 19:35:39.692275 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:39.692255 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-p9g8c_0f78be17-5c3e-439a-8923-6ef9297713a5/kube-storage-version-migrator-operator/0.log"
Apr 16 19:35:41.062612 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.062583 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:35:41.107360 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.107331 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/egress-router-binary-copy/0.log"
Apr 16 19:35:41.154135 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.154112 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/cni-plugins/0.log"
Apr 16 19:35:41.194417 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.194351 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/bond-cni-plugin/0.log"
Apr 16 19:35:41.235407 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.235376 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/routeoverride-cni/0.log"
Apr 16 19:35:41.273500 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.273472 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/whereabouts-cni-bincopy/0.log"
Apr 16 19:35:41.303117 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.303100 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lxp9r_f7e8daeb-176b-4474-94a0-f3d73d0cdf36/whereabouts-cni/0.log" Apr 16 19:35:41.667873 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.667838 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gp4vl_59c1c7ac-09cf-42a7-8c82-4ce36cbb0ac2/kube-multus/0.log" Apr 16 19:35:41.904006 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.903975 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lvnd8_aaeaee38-a562-498e-b7ec-505134c92159/network-metrics-daemon/0.log" Apr 16 19:35:41.943339 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:41.943276 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-lvnd8_aaeaee38-a562-498e-b7ec-505134c92159/kube-rbac-proxy/0.log" Apr 16 19:35:43.098560 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.098513 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-controller/0.log" Apr 16 19:35:43.122350 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.122324 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/0.log" Apr 16 19:35:43.139948 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.139921 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovn-acl-logging/1.log" Apr 16 19:35:43.164034 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.163976 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/kube-rbac-proxy-node/0.log" Apr 16 19:35:43.188335 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.188307 2577 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:35:43.209514 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.209493 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/northd/0.log" Apr 16 19:35:43.235927 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.235901 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/nbdb/0.log" Apr 16 19:35:43.259447 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.259395 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/sbdb/0.log" Apr 16 19:35:43.374905 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:43.374811 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qg4jj_0054f737-6e94-4be3-a7c1-1bf463f73c6c/ovnkube-controller/0.log" Apr 16 19:35:44.633538 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:44.633512 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-x86k2_8cde21b7-36c1-4558-9fda-d27494f89492/check-endpoints/0.log" Apr 16 19:35:44.720992 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:44.720966 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-z4lx8_62c9ee4c-19ed-4818-b5b8-5f08ec96d7ee/network-check-target-container/0.log" Apr 16 19:35:45.730841 ip-10-0-139-49 kubenswrapper[2577]: I0416 19:35:45.730813 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-w25sz_b83210b7-868a-4c77-8395-676d64fdd6ce/iptables-alerter/0.log" Apr 16 19:35:46.435084 ip-10-0-139-49 kubenswrapper[2577]: I0416 
19:35:46.435053 2577 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-zggh2_911d663f-58e1-4496-843e-77b3b532f155/tuned/0.log"