Apr 22 15:31:41.105582 ip-10-0-130-86 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 15:31:41.105600 ip-10-0-130-86 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 15:31:41.105610 ip-10-0-130-86 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 15:31:41.105934 ip-10-0-130-86 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 15:31:51.344072 ip-10-0-130-86 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 15:31:51.344094 ip-10-0-130-86 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot e61792536fde4a79b43e3975b041ffa4 --
Apr 22 15:34:21.597909 ip-10-0-130-86 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 15:34:22.034790 ip-10-0-130-86 kubenswrapper[2534]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:34:22.034790 ip-10-0-130-86 kubenswrapper[2534]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 15:34:22.034790 ip-10-0-130-86 kubenswrapper[2534]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:34:22.034790 ip-10-0-130-86 kubenswrapper[2534]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 15:34:22.034790 ip-10-0-130-86 kubenswrapper[2534]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 15:34:22.037199 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.037098 2534 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 15:34:22.042016 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.041991 2534 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:34:22.042016 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042014 2534 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:34:22.042016 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042019 2534 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042023 2534 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042026 2534 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042029 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042032 2534 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042035 2534 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042048 2534 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042051 2534 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042054 2534 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042057 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042059 2534 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042062 2534 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042065 2534 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042067 2534 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042070 2534 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042072 2534 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042075 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042077 2534 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042079 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042082 2534 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:34:22.042119 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042084 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042087 2534 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042089 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042092 2534 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042095 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042097 2534 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042100 2534 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042102 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042105 2534 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042107 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042110 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042112 2534 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042115 2534 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042117 2534 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042121 2534 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042123 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042126 2534 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042129 2534 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042132 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042150 2534 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:34:22.042666 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042154 2534 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042157 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042161 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042164 2534 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042166 2534 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042169 2534 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042171 2534 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042173 2534 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042176 2534 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042179 2534 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042181 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042184 2534 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042186 2534 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042188 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042191 2534 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042194 2534 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042196 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042199 2534 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042202 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:34:22.043154 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042205 2534 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042209 2534 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042212 2534 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042216 2534 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042219 2534 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042221 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042224 2534 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042228 2534 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042231 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042233 2534 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042236 2534 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042239 2534 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042241 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042249 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042252 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042255 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042257 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042261 2534 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042263 2534 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:34:22.043634 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042266 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042268 2534 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042271 2534 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042274 2534 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042276 2534 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042279 2534 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042728 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042734 2534 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042737 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042739 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042742 2534 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042745 2534 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042748 2534 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042751 2534 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042753 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042756 2534 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042758 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042761 2534 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042763 2534 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:34:22.044110 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042765 2534 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042769 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042771 2534 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042774 2534 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042777 2534 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042780 2534 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042782 2534 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042785 2534 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042788 2534 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042791 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042793 2534 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042796 2534 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042798 2534 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042801 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042803 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042805 2534 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042808 2534 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042810 2534 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042813 2534 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042817 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:34:22.044585 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042820 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042823 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042831 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042834 2534 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042836 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042840 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042842 2534 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042845 2534 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042847 2534 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042850 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042852 2534 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042855 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042857 2534 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042860 2534 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042862 2534 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042865 2534 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042868 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042870 2534 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042873 2534 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042875 2534 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:34:22.045113 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042880 2534 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042883 2534 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042886 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042888 2534 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042891 2534 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042895 2534 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042899 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042902 2534 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042905 2534 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042908 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042910 2534 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042916 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042919 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042921 2534 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042924 2534 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042927 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042929 2534 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042932 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042934 2534 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:34:22.045615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042937 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042939 2534 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042942 2534 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042944 2534 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042947 2534 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042949 2534 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042951 2534 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042954 2534 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042956 2534 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042959 2534 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042961 2534 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042964 2534 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042966 2534 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.042969 2534 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044391 2534 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044405 2534 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044413 2534 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044417 2534 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044422 2534 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044426 2534 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044430 2534 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 15:34:22.046094 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044435 2534 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044439 2534 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044442 2534 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044446 2534 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044449 2534 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044452 2534 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044455 2534 flags.go:64] FLAG: --cgroup-root=""
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044458 2534 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044461 2534 flags.go:64] FLAG: --client-ca-file=""
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044464 2534 flags.go:64] FLAG: --cloud-config=""
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044467 2534 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044470 2534 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044474 2534 flags.go:64] FLAG: --cluster-domain=""
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044477 2534 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044480 2534 flags.go:64] FLAG: --config-dir=""
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044483 2534 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044486 2534 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044490 2534 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044494 2534 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044497 2534 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044501 2534 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044504 2534 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044507 2534 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044510 2534 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044513 2534 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 15:34:22.046639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044517 2534 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044544 2534 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044548 2534 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044551 2534 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044553 2534 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044556 2534 flags.go:64] FLAG: --enable-server="true"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044559 2534 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044564 2534 flags.go:64] FLAG: --event-burst="100"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044567 2534 flags.go:64] FLAG: --event-qps="50"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044571 2534 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044574 2534 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044577 2534 flags.go:64] FLAG: --eviction-hard=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044581 2534 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044584 2534 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044587 2534 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044590 2534 flags.go:64] FLAG: --eviction-soft=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044593 2534 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044596 2534 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044599 2534 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044603 2534 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044606 2534 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044609 2534 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422
15:34:22.044612 2534 flags.go:64] FLAG: --feature-gates="" Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044615 2534 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044618 2534 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 15:34:22.047272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044622 2534 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044625 2534 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044628 2534 flags.go:64] FLAG: --healthz-port="10248" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044632 2534 flags.go:64] FLAG: --help="false" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044634 2534 flags.go:64] FLAG: --hostname-override="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044638 2534 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044641 2534 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044644 2534 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044647 2534 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044651 2534 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044654 2534 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 15:34:22.047971 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044657 2534 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044660 2534 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044662 2534 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044665 2534 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044668 2534 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044672 2534 flags.go:64] FLAG: --kube-reserved="" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044675 2534 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044678 2534 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044681 2534 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044684 2534 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044687 2534 flags.go:64] FLAG: --lock-file="" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044690 2534 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044693 2534 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 15:34:22.047971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044696 2534 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044702 2534 flags.go:64] FLAG: 
--log-json-split-stream="false" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044705 2534 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044707 2534 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044710 2534 flags.go:64] FLAG: --logging-format="text" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044713 2534 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044717 2534 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044720 2534 flags.go:64] FLAG: --manifest-url="" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044722 2534 flags.go:64] FLAG: --manifest-url-header="" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044727 2534 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044730 2534 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044734 2534 flags.go:64] FLAG: --max-pods="110" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044737 2534 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044740 2534 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044743 2534 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044746 2534 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 15:34:22.048570 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:22.044749 2534 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044752 2534 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044756 2534 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044764 2534 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044767 2534 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044770 2534 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044773 2534 flags.go:64] FLAG: --pod-cidr="" Apr 22 15:34:22.048570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044777 2534 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044783 2534 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044786 2534 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044789 2534 flags.go:64] FLAG: --pods-per-core="0" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044792 2534 flags.go:64] FLAG: --port="10250" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044795 2534 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044799 2534 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-091c3ac6236180301" Apr 22 
15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044802 2534 flags.go:64] FLAG: --qos-reserved="" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044805 2534 flags.go:64] FLAG: --read-only-port="10255" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044808 2534 flags.go:64] FLAG: --register-node="true" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044811 2534 flags.go:64] FLAG: --register-schedulable="true" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044813 2534 flags.go:64] FLAG: --register-with-taints="" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044817 2534 flags.go:64] FLAG: --registry-burst="10" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044820 2534 flags.go:64] FLAG: --registry-qps="5" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044823 2534 flags.go:64] FLAG: --reserved-cpus="" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044825 2534 flags.go:64] FLAG: --reserved-memory="" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044829 2534 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044832 2534 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044835 2534 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044838 2534 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044841 2534 flags.go:64] FLAG: --runonce="false" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044844 2534 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 15:34:22.049135 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044847 2534 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044850 2534 flags.go:64] FLAG: --seccomp-default="false" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044853 2534 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044856 2534 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 15:34:22.049135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044859 2534 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044862 2534 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044866 2534 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044869 2534 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044872 2534 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044874 2534 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044880 2534 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044883 2534 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044887 2534 flags.go:64] FLAG: --system-cgroups="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044889 2534 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:34:22.044895 2534 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044899 2534 flags.go:64] FLAG: --tls-cert-file="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044901 2534 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044908 2534 flags.go:64] FLAG: --tls-min-version="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044911 2534 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044914 2534 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044917 2534 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044920 2534 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044923 2534 flags.go:64] FLAG: --v="2" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044927 2534 flags.go:64] FLAG: --version="false" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044931 2534 flags.go:64] FLAG: --vmodule="" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044936 2534 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.044939 2534 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045052 2534 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 15:34:22.049818 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045056 2534 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 15:34:22.049818 
ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045059 2534 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045061 2534 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045064 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045066 2534 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045070 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045072 2534 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045075 2534 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045077 2534 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045080 2534 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045082 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045085 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045088 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 
15:34:22.045090 2534 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045095 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045097 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045100 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045102 2534 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045104 2534 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045107 2534 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045110 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 15:34:22.050424 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045112 2534 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045115 2534 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045118 2534 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045120 2534 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045123 2534 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 15:34:22.051029 ip-10-0-130-86 
kubenswrapper[2534]: W0422 15:34:22.045125 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045128 2534 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045130 2534 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045133 2534 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045135 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045138 2534 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045140 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045142 2534 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045145 2534 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045147 2534 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045150 2534 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045154 2534 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045156 2534 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045160 2534 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045164 2534 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 15:34:22.051029 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045167 2534 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045169 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045172 2534 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045175 2534 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045177 2534 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045181 2534 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045183 2534 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045186 2534 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045188 2534 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045191 2534 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 15:34:22.051615 ip-10-0-130-86 
kubenswrapper[2534]: W0422 15:34:22.045194 2534 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045196 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045199 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045201 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045203 2534 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045206 2534 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045208 2534 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045210 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045213 2534 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045215 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 15:34:22.051615 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045218 2534 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045220 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045224 2534 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045227 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045230 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045233 2534 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045235 2534 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045238 2534 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045242 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045244 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045247 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045249 2534 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045252 2534 feature_gate.go:328] unrecognized feature gate: Example Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045254 2534 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045257 2534 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045259 2534 
feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045262 2534 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045266 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045269 2534 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:34:22.052453 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045271 2534 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:34:22.053194 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045274 2534 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:34:22.053194 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045276 2534 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:34:22.053194 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045279 2534 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:34:22.053194 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.045281 2534 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:34:22.053194 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.045938 2534 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:34:22.054615 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.054592 2534 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 15:34:22.054615 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.054616 2534 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054679 2534 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054685 2534 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054688 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054691 2534 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054694 2534 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054696 2534 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054699 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054702 2534 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:34:22.054701 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054704 2534 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054707 2534 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054710 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054713 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054715 2534 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054718 2534 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054720 2534 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054723 2534 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054725 2534 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054728 2534 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054730 2534 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054732 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054735 2534 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054737 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054740 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054744 2534 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054749 2534 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054753 2534 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054766 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:34:22.054938 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054770 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054773 2534 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054776 2534 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054778 2534 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054781 2534 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054783 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054786 2534 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054788 2534 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054791 2534 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054793 2534 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054796 2534 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054798 2534 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054801 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054803 2534 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054806 2534 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054809 2534 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054812 2534 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054816 2534 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054818 2534 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:34:22.055404 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054820 2534 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054823 2534 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054825 2534 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054828 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054830 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054833 2534 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054835 2534 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054837 2534 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054840 2534 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054842 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054845 2534 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054848 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054850 2534 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054853 2534 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054856 2534 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054858 2534 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054861 2534 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054863 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054865 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054868 2534 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:34:22.055881 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054871 2534 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054873 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054875 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054878 2534 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054880 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054883 2534 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054885 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054888 2534 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054891 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054894 2534 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054897 2534 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054899 2534 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054902 2534 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054904 2534 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054907 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054909 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054911 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054914 2534 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054916 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:34:22.056426 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.054918 2534 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.054924 2534 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055029 2534 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055034 2534 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055037 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055040 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055043 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055045 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055049 2534 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055051 2534 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055053 2534 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055056 2534 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055058 2534 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055061 2534 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055063 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 15:34:22.056917 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055065 2534 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055068 2534 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055070 2534 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055073 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055075 2534 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055078 2534 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055080 2534 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055083 2534 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055085 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055089 2534 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055091 2534 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055094 2534 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055096 2534 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055099 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055101 2534 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055104 2534 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055106 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055109 2534 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055111 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055114 2534 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 15:34:22.057287 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055116 2534 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055118 2534 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055121 2534 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055125 2534 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055128 2534 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055130 2534 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055133 2534 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055136 2534 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055139 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055141 2534 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055144 2534 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055147 2534 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055149 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055152 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055154 2534 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055157 2534 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055159 2534 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055161 2534 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055164 2534 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 15:34:22.057787 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055166 2534 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055169 2534 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055171 2534 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055175 2534 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055178 2534 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055181 2534 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055183 2534 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055186 2534 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055188 2534 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055191 2534 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055193 2534 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055195 2534 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055198 2534 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055200 2534 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055202 2534 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055205 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055208 2534 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055210 2534 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055213 2534 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055215 2534 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 15:34:22.058283 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055219 2534 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055222 2534 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055225 2534 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055227 2534 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055230 2534 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055232 2534 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055235 2534 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055237 2534 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055240 2534 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055242 2534 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055245 2534 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055247 2534 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055250 2534 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:22.055253 2534 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.055258 2534 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 15:34:22.058880 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.055968 2534 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 15:34:22.059280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.057971 2534 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 15:34:22.059280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.058927 2534 server.go:1019] "Starting client certificate rotation"
Apr 22 15:34:22.059280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.059025 2534 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:34:22.059280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.059065 2534 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 15:34:22.083276 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.083247 2534 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:34:22.085910 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.085881 2534 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 15:34:22.100817 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.100783 2534 log.go:25] "Validated CRI v1 runtime API"
Apr 22 15:34:22.106641 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.106620 2534 log.go:25] "Validated CRI v1 image API"
Apr 22 15:34:22.107855 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.107836 2534 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 15:34:22.110569 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.110545 2534 fs.go:135] Filesystem UUIDs: map[35ff7eb3-8619-455f-be98-1b44425d5c2a:/dev/nvme0n1p3 72be8af0-f75a-40e5-9aa4-3a72dabd2b8d:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 22 15:34:22.110677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.110566 2534 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 15:34:22.113984 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.113960 2534 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:34:22.117990 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.117847 2534 manager.go:217] Machine: {Timestamp:2026-04-22 15:34:22.115640405 +0000 UTC m=+0.396794060 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101828 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec29388d23255fa310500070353605c2 SystemUUID:ec29388d-2325-5fa3-1050-0070353605c2 BootID:e6179253-6fde-4a79-b43e-3975b041ffa4 Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:b3:4d:13:a6:b9 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:b3:4d:13:a6:b9 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f6:de:3e:2d:0f:cc Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 15:34:22.117990 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.117984 2534 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 15:34:22.118123 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.118101 2534 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 15:34:22.120891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.120855 2534 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 15:34:22.121053 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.120894 2534 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-86.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 15:34:22.121141 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.121063 2534 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 15:34:22.121141 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.121074 2534 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 15:34:22.121141 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.121087 2534 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:34:22.122068 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.122054 2534 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 15:34:22.122955 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.122942 2534 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 15:34:22.123091 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.123081 2534 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 15:34:22.126287 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.126271 2534 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 15:34:22.126330 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.126292 2534 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 15:34:22.126330 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.126312 2534 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 15:34:22.126330 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.126327 2534 kubelet.go:397] "Adding apiserver pod source"
Apr 22 15:34:22.126402 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.126336 2534 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 15:34:22.127967 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.127952 2534 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:34:22.128024 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.127978 2534 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 15:34:22.131237 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.131218 2534 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 15:34:22.133223 ip-10-0-130-86
kubenswrapper[2534]: I0422 15:34:22.133208 2534 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 15:34:22.134815 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134801 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 15:34:22.134856 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134824 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 15:34:22.134856 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134834 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 15:34:22.134856 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134843 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 15:34:22.134856 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134851 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134863 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134873 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134881 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134890 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134899 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 15:34:22.134970 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134913 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 15:34:22.134970 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.134925 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 15:34:22.136425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.136404 2534 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ndgkm" Apr 22 15:34:22.136476 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.136464 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 15:34:22.136507 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.136479 2534 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 15:34:22.137857 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.137837 2534 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-86.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 15:34:22.137927 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.137837 2534 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 15:34:22.140368 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.140354 2534 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 15:34:22.140425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.140393 2534 server.go:1295] "Started kubelet" Apr 22 15:34:22.140518 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.140482 2534 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 15:34:22.140636 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.140570 2534 ratelimit.go:55] "Setting rate limiting for 
endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 15:34:22.140674 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.140661 2534 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 15:34:22.141359 ip-10-0-130-86 systemd[1]: Started Kubernetes Kubelet. Apr 22 15:34:22.142345 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.142327 2534 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 15:34:22.142413 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.142395 2534 server.go:317] "Adding debug handlers to kubelet server" Apr 22 15:34:22.144868 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.144826 2534 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ndgkm" Apr 22 15:34:22.146472 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.146452 2534 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 15:34:22.146914 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.145934 2534 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-86.ec2.internal.18a8b7b9eb314510 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-86.ec2.internal,UID:ip-10-0-130-86.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-86.ec2.internal,},FirstTimestamp:2026-04-22 15:34:22.140368144 +0000 UTC m=+0.421521776,LastTimestamp:2026-04-22 15:34:22.140368144 +0000 UTC m=+0.421521776,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-86.ec2.internal,}" Apr 22 15:34:22.147132 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.147076 2534 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 15:34:22.147961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.147875 2534 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 15:34:22.147961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.147899 2534 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 15:34:22.147961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.147909 2534 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 15:34:22.148150 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.148116 2534 reconstruct.go:97] "Volume reconstruction finished" Apr 22 15:34:22.148150 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.148127 2534 reconciler.go:26] "Reconciler: start to sync state" Apr 22 15:34:22.148358 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.148324 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.150052 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150034 2534 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 15:34:22.150052 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150053 2534 factory.go:55] Registering systemd factory Apr 22 15:34:22.150212 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150063 2534 factory.go:223] Registration of the systemd container factory successfully Apr 22 15:34:22.150396 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150377 2534 factory.go:153] Registering CRI-O factory Apr 22 15:34:22.150396 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:22.150398 2534 factory.go:223] Registration of the crio container factory successfully Apr 22 15:34:22.150575 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150424 2534 factory.go:103] Registering Raw factory Apr 22 15:34:22.150575 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.150439 2534 manager.go:1196] Started watching for new ooms in manager Apr 22 15:34:22.151641 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.151246 2534 manager.go:319] Starting recovery of all containers Apr 22 15:34:22.155724 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.155688 2534 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:34:22.156128 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.156101 2534 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 15:34:22.159422 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.159391 2534 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-86.ec2.internal\" not found" node="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.162836 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.162689 2534 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-86.ec2.internal" not found Apr 22 15:34:22.165862 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.165843 2534 manager.go:324] Recovery completed Apr 22 15:34:22.170272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.170257 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.172948 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.172931 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.173015 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:22.172966 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.173015 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.172979 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.173495 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.173477 2534 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 15:34:22.173495 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.173490 2534 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 15:34:22.173649 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.173506 2534 state_mem.go:36] "Initialized new in-memory state store" Apr 22 15:34:22.175994 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.175978 2534 policy_none.go:49] "None policy: Start" Apr 22 15:34:22.175994 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.175998 2534 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 15:34:22.176102 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.176011 2534 state_mem.go:35] "Initializing new in-memory state store" Apr 22 15:34:22.180349 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.180325 2534 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-86.ec2.internal" not found Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.218863 2534 manager.go:341] "Starting Device Plugin manager" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.218918 2534 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.218930 2534 server.go:85] "Starting device plugin registration server" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.219179 2534 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.219209 2534 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.219301 2534 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.219386 2534 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.219395 2534 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.219972 2534 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 15:34:22.232639 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.220014 2534 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.236440 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.236417 2534 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-130-86.ec2.internal" not found Apr 22 15:34:22.280272 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.280211 2534 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 15:34:22.281637 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.281607 2534 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 22 15:34:22.281752 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.281655 2534 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 15:34:22.281752 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.281689 2534 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 15:34:22.281752 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.281698 2534 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 15:34:22.281868 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.281742 2534 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 15:34:22.283979 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.283958 2534 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:34:22.320211 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.320106 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.321371 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.321354 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.321444 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.321389 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.321444 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.321404 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.321444 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.321434 2534 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.330318 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.330292 2534 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.330386 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.330329 2534 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-86.ec2.internal\": node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.346455 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.346425 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.382399 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.382363 2534 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal"] Apr 22 15:34:22.382480 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.382469 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.384590 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.384572 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.384694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.384602 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.384694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.384617 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.386156 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.386141 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.386296 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.386282 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.386334 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.386319 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.387830 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387814 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.387830 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387821 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.387937 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387840 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.387937 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387843 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.387937 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387850 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.387937 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.387853 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.389203 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.389190 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.389242 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.389218 2534 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 15:34:22.390148 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.390128 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientMemory" Apr 22 15:34:22.390250 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.390162 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 15:34:22.390250 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.390178 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeHasSufficientPID" Apr 22 15:34:22.417403 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.417377 2534 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-86.ec2.internal\" not found" node="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.421987 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.421966 2534 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-86.ec2.internal\" not found" node="ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.447328 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.447296 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.450675 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.450640 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.450781 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.450695 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.450781 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.450717 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/93607452ba047e869102040d23558016-config\") pod \"kube-apiserver-proxy-ip-10-0-130-86.ec2.internal\" (UID: \"93607452ba047e869102040d23558016\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.547466 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.547424 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.551771 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551750 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.551871 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551780 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/93607452ba047e869102040d23558016-config\") pod \"kube-apiserver-proxy-ip-10-0-130-86.ec2.internal\" (UID: \"93607452ba047e869102040d23558016\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.551871 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551801 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.551871 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551860 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/93607452ba047e869102040d23558016-config\") pod \"kube-apiserver-proxy-ip-10-0-130-86.ec2.internal\" (UID: \"93607452ba047e869102040d23558016\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.551986 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551860 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.551986 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.551859 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/79b905854054fe4837d8eeee581d56c0-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal\" (UID: \"79b905854054fe4837d8eeee581d56c0\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.648298 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.648216 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.719612 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.719589 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.724318 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.724292 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" Apr 22 15:34:22.749260 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.749214 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.849754 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.849716 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.950279 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:22.950228 2534 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-86.ec2.internal\" not found" Apr 22 15:34:22.979175 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:22.979144 2534 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 15:34:23.048175 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.048125 2534 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" Apr 22 15:34:23.058688 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.058642 2534 transport.go:147] "Certificate rotation detected, shutting down client connections 
to start using new credentials"
Apr 22 15:34:23.058870 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.058852 2534 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:34:23.058940 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.058866 2534 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:34:23.058940 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.058859 2534 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 15:34:23.059567 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.058933 2534 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://a5aba199340a8407ab2a8b9ce1908086-2fc34e39f94b1425.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.130.86:36504->3.214.47.229:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal"
Apr 22 15:34:23.059567 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.058968 2534 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal"
Apr 22 15:34:23.080847 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.080817 2534 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 15:34:23.127320 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.127278 2534 apiserver.go:52] "Watching apiserver"
Apr 22 15:34:23.135936 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.135896 2534 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 22 15:34:23.138901 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.138860 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh","openshift-cluster-node-tuning-operator/tuned-z7pk7","openshift-image-registry/node-ca-l947q","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal","openshift-multus/network-metrics-daemon-82tqk","openshift-network-diagnostics/network-check-target-h8d7v","kube-system/konnectivity-agent-6vjfh","openshift-dns/node-resolver-wbjzt","openshift-multus/multus-additional-cni-plugins-7b4hc","openshift-multus/multus-vtcgb","openshift-network-operator/iptables-alerter-hdcgf","openshift-ovn-kubernetes/ovnkube-node-fpdtl"]
Apr 22 15:34:23.140337 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.140313 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:23.140448 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.140424 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1"
Apr 22 15:34:23.141553 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.141517 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.142823 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.142790 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.143966 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.143944 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.144177 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.144156 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.144372 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.144352 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 22 15:34:23.144912 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.144890 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.144912 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.144907 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wthx2\""
Apr 22 15:34:23.145054 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.144926 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.145293 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.145276 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:23.145353 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.145313 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-gmt97\""
Apr 22 15:34:23.145401 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.145343 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd"
Apr 22 15:34:23.145565 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.145551 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.146050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146032 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 15:34:23.146151 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146127 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.146511 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146492 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-bwr7v\""
Apr 22 15:34:23.146619 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146514 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.146619 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146546 2534 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 15:34:23.146978 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146960 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6vjfh"
Apr 22 15:34:23.147046 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.146988 2534 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 15:29:22 +0000 UTC" deadline="2027-12-24 18:51:06.304582062 +0000 UTC"
Apr 22 15:34:23.147046 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.147012 2534 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14667h16m43.157573945s"
Apr 22 15:34:23.148966 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.148939 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 22 15:34:23.149255 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.149240 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wbjzt"
Apr 22 15:34:23.149362 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.149344 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.149648 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.149630 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 22 15:34:23.149738 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.149722 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-zdcdt\""
Apr 22 15:34:23.150579 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.150509 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.151382 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151362 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-ldwqs\""
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151742 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151846 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151866 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151891 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.151850 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hdcgf"
Apr 22 15:34:23.151762 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.152011 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zbbc9\""
Apr 22 15:34:23.152398 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.152300 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.152398 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.152338 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 22 15:34:23.152649 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.152634 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 22 15:34:23.153229 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.153214 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.153306 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.153282 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-m6qv5\""
Apr 22 15:34:23.153592 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.153453 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.154831 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.154810 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.155268 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155246 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-222gn\""
Apr 22 15:34:23.155354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155246 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.155354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155280 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 22 15:34:23.155595 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155574 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-host\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.155701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155602 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-modprobe-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.155701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155618 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-run\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.155701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155634 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-host\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.155701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155664 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155758 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155759 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155758 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155813 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-db5kj\""
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155820 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155860 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-device-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.155904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155890 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk5fd\" (UniqueName: \"kubernetes.io/projected/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-kube-api-access-tk5fd\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155940 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155969 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95598f8a-db85-47f2-859f-d47efcdbfa09-konnectivity-ca\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156008 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.155995 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-socket-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156075 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-lib-modules\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156107 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-etc-tuned\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156141 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkbb\" (UniqueName: \"kubernetes.io/projected/c2e72f84-cb38-472e-abba-c2f44adaf2fd-kube-api-access-fxkbb\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156173 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-hosts-file\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156199 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156218 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 22 15:34:23.156225 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156225 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-registration-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156251 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysconfig\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156273 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156299 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-conf\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156345 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-var-lib-kubelet\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156370 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xr8d\" (UniqueName: \"kubernetes.io/projected/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-kube-api-access-7xr8d\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156406 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-kubernetes\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156426 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-tmp\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156449 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jx4\" (UniqueName: \"kubernetes.io/projected/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kube-api-access-z6jx4\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156470 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-systemd\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156566 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42bcv\" (UniqueName: \"kubernetes.io/projected/8781a272-57f8-42fd-84df-15814bf56a2a-kube-api-access-42bcv\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156610 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95598f8a-db85-47f2-859f-d47efcdbfa09-agent-certs\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156639 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-serviceca\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.156695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156676 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-sys\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.157132 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156713 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-sys-fs\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.157132 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156784 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-tmp-dir\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt"
Apr 22 15:34:23.157132 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.156921 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 22 15:34:23.165003 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.164969 2534 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 15:34:23.189473 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.189441 2534 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-br5vh"
Apr 22 15:34:23.199943 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.199912 2534 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-br5vh"
Apr 22 15:34:23.214739 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.214702 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79b905854054fe4837d8eeee581d56c0.slice/crio-b5f2edad41aef69d9bf2177c46e81249072488b758e8bbe9a2990835e9bdaca9 WatchSource:0}: Error finding container b5f2edad41aef69d9bf2177c46e81249072488b758e8bbe9a2990835e9bdaca9: Status 404 returned error can't find the container with id b5f2edad41aef69d9bf2177c46e81249072488b758e8bbe9a2990835e9bdaca9
Apr 22 15:34:23.215359 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.215326 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93607452ba047e869102040d23558016.slice/crio-a68b1dac232c089083e23d608b3ba0ad9b4b940050fb52c0d0e29618aa1381b2 WatchSource:0}: Error finding container a68b1dac232c089083e23d608b3ba0ad9b4b940050fb52c0d0e29618aa1381b2: Status 404 returned error can't find the container with id a68b1dac232c089083e23d608b3ba0ad9b4b940050fb52c0d0e29618aa1381b2
Apr 22 15:34:23.219889 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.219868 2534 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 15:34:23.249475 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.249448 2534 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 15:34:23.257172 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257136 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-system-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.257172 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257176 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-etc-kubernetes\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257201 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxtk\" (UniqueName: \"kubernetes.io/projected/d870e76b-ada6-4b96-8ffa-57ad8f8da412-kube-api-access-bhxtk\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257225 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-sys-fs\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257243 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-host\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257274 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-run\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257292 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-host\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257296 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-sys-fs\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257309 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-cnibin\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257334 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-host\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257355 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-run\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.257370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257365 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-host\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257363 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257400 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovn-node-metrics-cert\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257421 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-k8s-cni-cncf-io\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257443 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-conf-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257468 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257491 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kp66\" (UniqueName: \"kubernetes.io/projected/ca5aa4d2-6513-4631-aeb7-c9120934e117-kube-api-access-5kp66\") pod \"multus-additional-cni-plugins-7b4hc\" (UID:
\"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257508 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-bin\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257543 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-socket-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.257632 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:23.257727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257646 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-etc-tuned\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257750 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-socket-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.258153 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257748 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkbb\" (UniqueName: \"kubernetes.io/projected/c2e72f84-cb38-472e-abba-c2f44adaf2fd-kube-api-access-fxkbb\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.257780 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.757740019 +0000 UTC m=+2.038893642 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257893 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-netns\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257936 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-bin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257955 2534 
swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.257991 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-multus-certs\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258042 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-binary-copy\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258067 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258095 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-hosts-file\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:34:23.258117 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-registration-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.258153 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258145 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysconfig\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258222 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-hosts-file\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258225 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-var-lib-kubelet\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258251 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-registration-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.258673 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258277 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-var-lib-kubelet\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258276 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-daemon-config\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258313 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysconfig\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258321 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-slash\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258378 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-var-lib-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258404 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258422 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xr8d\" (UniqueName: \"kubernetes.io/projected/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-kube-api-access-7xr8d\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258438 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f42d3546-8a90-41a3-a8b3-01565e6ed78c-iptables-alerter-script\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258454 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f42d3546-8a90-41a3-a8b3-01565e6ed78c-host-slash\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258469 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-netns\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258484 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-etc-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258543 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258578 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-systemd\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.258673 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258608 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-socket-dir-parent\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258632 2534 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-ovn\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258666 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-systemd\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258668 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-netd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258705 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gjrd\" (UniqueName: \"kubernetes.io/projected/92d55e47-0b30-4f98-aff7-3b7325bb3839-kube-api-access-2gjrd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258725 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-serviceca\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258743 
2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-tmp-dir\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258763 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-node-log\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258779 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-modprobe-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258822 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-kubelet\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258842 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-systemd-units\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:23.258855 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-systemd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258873 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-os-release\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258889 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-multus\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258912 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258947 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-etc-selinux\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258947 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258969 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-modprobe-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.258979 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-device-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259005 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk5fd\" (UniqueName: \"kubernetes.io/projected/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-kube-api-access-tk5fd\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259026 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kubelet-dir\") pod 
\"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259029 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95598f8a-db85-47f2-859f-d47efcdbfa09-konnectivity-ca\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259066 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-lib-modules\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259095 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-os-release\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259115 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/22adcb4d-4eb7-401a-8a80-b9d872a481d3-device-dir\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259119 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259160 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259170 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-tmp-dir\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259196 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-log-socket\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259215 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-lib-modules\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259252 2534 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-env-overrides\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259286 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259349 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259374 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-serviceca\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q" Apr 22 15:34:23.259908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259422 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-conf\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:34:23.259453 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-config\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259467 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-d\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259483 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-kubernetes\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259512 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-tmp\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259551 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-kubernetes\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:34:23.259554 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cnibin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259591 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-kubelet\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259605 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-hostroot\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259605 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-etc-sysctl-conf\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259619 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-system-cni-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.260450 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259634 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/95598f8a-db85-47f2-859f-d47efcdbfa09-konnectivity-ca\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259654 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259686 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqrwd\" (UniqueName: \"kubernetes.io/projected/f42d3546-8a90-41a3-a8b3-01565e6ed78c-kube-api-access-gqrwd\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259714 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jx4\" (UniqueName: \"kubernetes.io/projected/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kube-api-access-z6jx4\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259739 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42bcv\" (UniqueName: \"kubernetes.io/projected/8781a272-57f8-42fd-84df-15814bf56a2a-kube-api-access-42bcv\") 
pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259764 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95598f8a-db85-47f2-859f-d47efcdbfa09-agent-certs\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:23.260450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259792 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cni-binary-copy\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.260969 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259818 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-script-lib\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.260969 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259856 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-sys\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.260969 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.259972 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8781a272-57f8-42fd-84df-15814bf56a2a-sys\") pod \"tuned-z7pk7\" 
(UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.261304 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.261285 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-etc-tuned\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.261724 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.261708 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/8781a272-57f8-42fd-84df-15814bf56a2a-tmp\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.262040 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.262024 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/95598f8a-db85-47f2-859f-d47efcdbfa09-agent-certs\") pod \"konnectivity-agent-6vjfh\" (UID: \"95598f8a-db85-47f2-859f-d47efcdbfa09\") " pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:23.267494 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.267471 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:23.267494 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.267492 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:23.267494 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.267502 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:23.267761 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.267622 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:23.767579241 +0000 UTC m=+2.048732862 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:23.268835 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.268806 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xr8d\" (UniqueName: \"kubernetes.io/projected/1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0-kube-api-access-7xr8d\") pod \"node-resolver-wbjzt\" (UID: \"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0\") " pod="openshift-dns/node-resolver-wbjzt" Apr 22 15:34:23.269012 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.268997 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxkbb\" (UniqueName: \"kubernetes.io/projected/c2e72f84-cb38-472e-abba-c2f44adaf2fd-kube-api-access-fxkbb\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:23.269092 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.269070 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk5fd\" (UniqueName: 
\"kubernetes.io/projected/60d1d72d-9ad8-4148-82bc-8ae8873fe4c8-kube-api-access-tk5fd\") pod \"node-ca-l947q\" (UID: \"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8\") " pod="openshift-image-registry/node-ca-l947q" Apr 22 15:34:23.269384 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.269368 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jx4\" (UniqueName: \"kubernetes.io/projected/22adcb4d-4eb7-401a-8a80-b9d872a481d3-kube-api-access-z6jx4\") pod \"aws-ebs-csi-driver-node-mrcgh\" (UID: \"22adcb4d-4eb7-401a-8a80-b9d872a481d3\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" Apr 22 15:34:23.270555 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.270516 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42bcv\" (UniqueName: \"kubernetes.io/projected/8781a272-57f8-42fd-84df-15814bf56a2a-kube-api-access-42bcv\") pod \"tuned-z7pk7\" (UID: \"8781a272-57f8-42fd-84df-15814bf56a2a\") " pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" Apr 22 15:34:23.285710 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.285651 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" event={"ID":"79b905854054fe4837d8eeee581d56c0","Type":"ContainerStarted","Data":"b5f2edad41aef69d9bf2177c46e81249072488b758e8bbe9a2990835e9bdaca9"} Apr 22 15:34:23.286628 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.286606 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" event={"ID":"93607452ba047e869102040d23558016","Type":"ContainerStarted","Data":"a68b1dac232c089083e23d608b3ba0ad9b4b940050fb52c0d0e29618aa1381b2"} Apr 22 15:34:23.361050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361006 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-system-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361050 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-etc-kubernetes\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361066 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bhxtk\" (UniqueName: \"kubernetes.io/projected/d870e76b-ada6-4b96-8ffa-57ad8f8da412-kube-api-access-bhxtk\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361084 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-cnibin\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361101 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361121 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovn-node-metrics-cert\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361133 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-etc-kubernetes\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361135 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-system-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361143 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-k8s-cni-cncf-io\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361170 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-cnibin\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361185 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361177 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-k8s-cni-cncf-io\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361201 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-conf-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361247 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kp66\" (UniqueName: \"kubernetes.io/projected/ca5aa4d2-6513-4631-aeb7-c9120934e117-kube-api-access-5kp66\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361271 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-conf-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361339 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361330 2534 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-bin\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361363 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-bin\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361368 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-netns\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361393 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-bin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361410 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-multus-certs\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361429 2534 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-binary-copy\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361478 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-netns\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361514 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361568 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-run-multus-certs\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361574 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-daemon-config\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361617 2534 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-bin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361621 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-slash\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361647 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-slash\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361668 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-var-lib-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361695 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361727 2534 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f42d3546-8a90-41a3-a8b3-01565e6ed78c-iptables-alerter-script\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361730 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-var-lib-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.361933 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361760 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-ovn-kubernetes\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361765 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f42d3546-8a90-41a3-a8b3-01565e6ed78c-host-slash\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361858 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f42d3546-8a90-41a3-a8b3-01565e6ed78c-host-slash\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf" Apr 22 15:34:23.362694 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:23.361870 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-netns\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.361988 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-run-netns\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362020 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-etc-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362044 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362066 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-etc-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:23.362072 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-binary-copy\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362071 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-socket-dir-parent\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362090 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-daemon-config\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362122 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-socket-dir-parent\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362123 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-ovn\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:23.362154 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-ovn\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362158 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-netd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362133 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-openvswitch\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362174 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc" Apr 22 15:34:23.362694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362186 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gjrd\" (UniqueName: \"kubernetes.io/projected/92d55e47-0b30-4f98-aff7-3b7325bb3839-kube-api-access-2gjrd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362209 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-cni-netd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362219 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-node-log\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362242 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-kubelet\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362259 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-systemd-units\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362273 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-systemd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362272 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/f42d3546-8a90-41a3-a8b3-01565e6ed78c-iptables-alerter-script\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362297 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-os-release\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362314 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-node-log\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362323 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-multus\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362327 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-host-kubelet\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362336 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-systemd-units\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362351 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-os-release\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362366 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362373 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-os-release\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362390 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362391 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-cni-multus\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362417 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-log-socket\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363425 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362424 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-os-release\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362435 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-multus-cni-dir\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362452 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-log-socket\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362443 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/92d55e47-0b30-4f98-aff7-3b7325bb3839-run-systemd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362451 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-env-overrides\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362512 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-config\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362562 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cnibin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362579 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-kubelet\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362593 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-hostroot\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362608 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-system-cni-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362630 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362649 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqrwd\" (UniqueName: \"kubernetes.io/projected/f42d3546-8a90-41a3-a8b3-01565e6ed78c-kube-api-access-gqrwd\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362677 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cni-binary-copy\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362679 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-host-var-lib-kubelet\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362699 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-script-lib\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362728 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cnibin\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362754 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d870e76b-ada6-4b96-8ffa-57ad8f8da412-hostroot\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362786 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-system-cni-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.363920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362884 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/ca5aa4d2-6513-4631-aeb7-c9120934e117-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362896 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ca5aa4d2-6513-4631-aeb7-c9120934e117-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.362984 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-env-overrides\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.363040 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-config\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.363154 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovnkube-script-lib\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.363323 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d870e76b-ada6-4b96-8ffa-57ad8f8da412-cni-binary-copy\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.364456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.363722 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/92d55e47-0b30-4f98-aff7-3b7325bb3839-ovn-node-metrics-cert\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.380219 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.380184 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxtk\" (UniqueName: \"kubernetes.io/projected/d870e76b-ada6-4b96-8ffa-57ad8f8da412-kube-api-access-bhxtk\") pod \"multus-vtcgb\" (UID: \"d870e76b-ada6-4b96-8ffa-57ad8f8da412\") " pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.380597 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.380574 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqrwd\" (UniqueName: \"kubernetes.io/projected/f42d3546-8a90-41a3-a8b3-01565e6ed78c-kube-api-access-gqrwd\") pod \"iptables-alerter-hdcgf\" (UID: \"f42d3546-8a90-41a3-a8b3-01565e6ed78c\") " pod="openshift-network-operator/iptables-alerter-hdcgf"
Apr 22 15:34:23.380886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.380870 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gjrd\" (UniqueName: \"kubernetes.io/projected/92d55e47-0b30-4f98-aff7-3b7325bb3839-kube-api-access-2gjrd\") pod \"ovnkube-node-fpdtl\" (UID: \"92d55e47-0b30-4f98-aff7-3b7325bb3839\") " pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.382424 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.382409 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kp66\" (UniqueName: \"kubernetes.io/projected/ca5aa4d2-6513-4631-aeb7-c9120934e117-kube-api-access-5kp66\") pod \"multus-additional-cni-plugins-7b4hc\" (UID: \"ca5aa4d2-6513-4631-aeb7-c9120934e117\") " pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.463424 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.463332 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh"
Apr 22 15:34:23.470150 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.470116 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22adcb4d_4eb7_401a_8a80_b9d872a481d3.slice/crio-779ca106e17b0d29a6d1c733bd8dfce471a8e8df9ad2a503b9bff777e3fd0e41 WatchSource:0}: Error finding container 779ca106e17b0d29a6d1c733bd8dfce471a8e8df9ad2a503b9bff777e3fd0e41: Status 404 returned error can't find the container with id 779ca106e17b0d29a6d1c733bd8dfce471a8e8df9ad2a503b9bff777e3fd0e41
Apr 22 15:34:23.479502 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.479472 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-z7pk7"
Apr 22 15:34:23.487838 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.487803 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8781a272_57f8_42fd_84df_15814bf56a2a.slice/crio-a9f0385c4de8f28e9a2c91924a75947508ea006ce84453ba6b11696ae2452b78 WatchSource:0}: Error finding container a9f0385c4de8f28e9a2c91924a75947508ea006ce84453ba6b11696ae2452b78: Status 404 returned error can't find the container with id a9f0385c4de8f28e9a2c91924a75947508ea006ce84453ba6b11696ae2452b78
Apr 22 15:34:23.493934 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.493910 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l947q"
Apr 22 15:34:23.498691 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.498672 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6vjfh"
Apr 22 15:34:23.499444 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.499423 2534 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:34:23.500411 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.500386 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60d1d72d_9ad8_4148_82bc_8ae8873fe4c8.slice/crio-3d8f3cc7e0087a89f999425449a85d3f1310da4b62ea988290ee499e47d3ec6d WatchSource:0}: Error finding container 3d8f3cc7e0087a89f999425449a85d3f1310da4b62ea988290ee499e47d3ec6d: Status 404 returned error can't find the container with id 3d8f3cc7e0087a89f999425449a85d3f1310da4b62ea988290ee499e47d3ec6d
Apr 22 15:34:23.505985 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.505954 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95598f8a_db85_47f2_859f_d47efcdbfa09.slice/crio-a486179661f33e4e15c888bccaa12bfe66ee8cf8b4eff0b6c05017aa639e0c7a WatchSource:0}: Error finding container a486179661f33e4e15c888bccaa12bfe66ee8cf8b4eff0b6c05017aa639e0c7a: Status 404 returned error can't find the container with id a486179661f33e4e15c888bccaa12bfe66ee8cf8b4eff0b6c05017aa639e0c7a
Apr 22 15:34:23.525054 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.525018 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wbjzt"
Apr 22 15:34:23.531025 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.530998 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7b4hc"
Apr 22 15:34:23.533032 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.532997 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6ce3c0_cf70_422a_a8f8_3889b4bcc3d0.slice/crio-7addddf8ef63a97ec887e465e6f8c734e60806dad70df6b24149986d3691652a WatchSource:0}: Error finding container 7addddf8ef63a97ec887e465e6f8c734e60806dad70df6b24149986d3691652a: Status 404 returned error can't find the container with id 7addddf8ef63a97ec887e465e6f8c734e60806dad70df6b24149986d3691652a
Apr 22 15:34:23.534278 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.534239 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vtcgb"
Apr 22 15:34:23.538545 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.538504 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca5aa4d2_6513_4631_aeb7_c9120934e117.slice/crio-6b043ec0bc06310ab832b844299aa7dce4b09cf20ef8ab3e7939aa6eae955714 WatchSource:0}: Error finding container 6b043ec0bc06310ab832b844299aa7dce4b09cf20ef8ab3e7939aa6eae955714: Status 404 returned error can't find the container with id 6b043ec0bc06310ab832b844299aa7dce4b09cf20ef8ab3e7939aa6eae955714
Apr 22 15:34:23.539498 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.539480 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hdcgf"
Apr 22 15:34:23.540627 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.540602 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd870e76b_ada6_4b96_8ffa_57ad8f8da412.slice/crio-302c24a4cf969c6faff1691a5473d4403b655325190e29811162715a428f296d WatchSource:0}: Error finding container 302c24a4cf969c6faff1691a5473d4403b655325190e29811162715a428f296d: Status 404 returned error can't find the container with id 302c24a4cf969c6faff1691a5473d4403b655325190e29811162715a428f296d
Apr 22 15:34:23.544497 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.544475 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:34:23.546693 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.546670 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf42d3546_8a90_41a3_a8b3_01565e6ed78c.slice/crio-046c177da6972474a67f2dcf565183549cda2af238c2d6b7f91285ff88168587 WatchSource:0}: Error finding container 046c177da6972474a67f2dcf565183549cda2af238c2d6b7f91285ff88168587: Status 404 returned error can't find the container with id 046c177da6972474a67f2dcf565183549cda2af238c2d6b7f91285ff88168587
Apr 22 15:34:23.552690 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:34:23.552665 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d55e47_0b30_4f98_aff7_3b7325bb3839.slice/crio-26f6405064107b18ebe35bcfbac204fbb9da3d59ec7b2cc7b13c3e16fb0d5ed8 WatchSource:0}: Error finding container 26f6405064107b18ebe35bcfbac204fbb9da3d59ec7b2cc7b13c3e16fb0d5ed8: Status 404 returned error can't find the container with id 26f6405064107b18ebe35bcfbac204fbb9da3d59ec7b2cc7b13c3e16fb0d5ed8
Apr 22 15:34:23.766030 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.765936 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:23.766191 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.766088 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:23.766191 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.766150 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.76613564 +0000 UTC m=+3.047289264 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:23.866755 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:23.866637 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:23.866927 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.866914 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:34:23.866977 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.866932 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:34:23.866977 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.866941 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:23.867126 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:23.867022 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:24.866985279 +0000 UTC m=+3.148138900 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:24.018857 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.018773 2534 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:34:24.201302 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.201121 2534 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:29:23 +0000 UTC" deadline="2028-02-04 21:56:39.471302618 +0000 UTC"
Apr 22 15:34:24.201302 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.201175 2534 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15678h22m15.270132235s"
Apr 22 15:34:24.325781 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.325411 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"26f6405064107b18ebe35bcfbac204fbb9da3d59ec7b2cc7b13c3e16fb0d5ed8"}
Apr 22 15:34:24.339754 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.339712 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hdcgf" event={"ID":"f42d3546-8a90-41a3-a8b3-01565e6ed78c","Type":"ContainerStarted","Data":"046c177da6972474a67f2dcf565183549cda2af238c2d6b7f91285ff88168587"}
Apr 22 15:34:24.353344 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.353296 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vtcgb" event={"ID":"d870e76b-ada6-4b96-8ffa-57ad8f8da412","Type":"ContainerStarted","Data":"302c24a4cf969c6faff1691a5473d4403b655325190e29811162715a428f296d"}
Apr 22 15:34:24.355582 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.355518 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wbjzt" event={"ID":"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0","Type":"ContainerStarted","Data":"7addddf8ef63a97ec887e465e6f8c734e60806dad70df6b24149986d3691652a"}
Apr 22 15:34:24.381963 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.381876 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6vjfh" event={"ID":"95598f8a-db85-47f2-859f-d47efcdbfa09","Type":"ContainerStarted","Data":"a486179661f33e4e15c888bccaa12bfe66ee8cf8b4eff0b6c05017aa639e0c7a"}
Apr 22 15:34:24.393288 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.392956 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" event={"ID":"22adcb4d-4eb7-401a-8a80-b9d872a481d3","Type":"ContainerStarted","Data":"779ca106e17b0d29a6d1c733bd8dfce471a8e8df9ad2a503b9bff777e3fd0e41"}
Apr 22 15:34:24.411268 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.411224 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerStarted","Data":"6b043ec0bc06310ab832b844299aa7dce4b09cf20ef8ab3e7939aa6eae955714"}
Apr 22 15:34:24.437184 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.436966 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l947q" event={"ID":"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8","Type":"ContainerStarted","Data":"3d8f3cc7e0087a89f999425449a85d3f1310da4b62ea988290ee499e47d3ec6d"}
Apr 22 15:34:24.461142 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.461098 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" event={"ID":"8781a272-57f8-42fd-84df-15814bf56a2a","Type":"ContainerStarted","Data":"a9f0385c4de8f28e9a2c91924a75947508ea006ce84453ba6b11696ae2452b78"}
Apr 22 15:34:24.653886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.653786 2534 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:34:24.774086 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.774048 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:24.774274 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.774204 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:24.774274 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.774271 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:26.774251367 +0000 UTC m=+5.055405000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:24.875594 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.875555 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:24.875785 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.875719 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:34:24.875785 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.875740 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:34:24.875785 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.875752 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:24.875939 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:24.875815 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:26.875795308 +0000 UTC m=+5.156948929 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:24.977256 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:24.977219 2534 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 15:34:25.201767 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:25.201716 2534 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 15:29:23 +0000 UTC" deadline="2027-10-24 23:24:26.129399905 +0000 UTC"
Apr 22 15:34:25.201767 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:25.201764 2534 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13207h50m0.927640355s"
Apr 22 15:34:25.282084 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:25.281979 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:25.282253 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:25.282113 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1"
Apr 22 15:34:25.282803 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:25.282628 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:25.282803 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:25.282751 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd"
Apr 22 15:34:26.791784 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:26.791496 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:26.792255 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.791665 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:26.792255 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.791883 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:30.791860103 +0000 UTC m=+9.073013725 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 15:34:26.892725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:26.892681 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:26.892968 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.892901 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 15:34:26.892968 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.892920 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 15:34:26.892968 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.892935 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 15:34:26.893120 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:26.892992 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed.
No retries permitted until 2026-04-22 15:34:30.892972839 +0000 UTC m=+9.174126473 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:27.282772 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:27.282085 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:27.282772 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:27.282206 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:27.282772 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:27.282615 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:27.282772 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:27.282726 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:29.282302 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:29.282260 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:29.282804 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:29.282401 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:29.282874 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:29.282822 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:29.282919 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:29.282894 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:30.827983 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:30.827950 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:30.828451 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.828118 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:30.828451 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.828202 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:38.82817896 +0000 UTC m=+17.109332602 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:30.928590 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:30.928547 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:30.928769 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.928744 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:30.928840 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.928770 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:30.928840 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.928786 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:30.928941 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:30.928851 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:38.928833354 +0000 UTC m=+17.209986974 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:31.282941 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:31.282895 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:31.283146 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:31.283037 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:31.283422 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:31.283393 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:31.283578 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:31.283490 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:33.282753 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:33.282718 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:33.283197 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:33.282717 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:33.283197 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:33.282874 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:33.283197 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:33.282907 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:35.281971 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:35.281929 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:35.282422 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:35.281929 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:35.282422 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:35.282092 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:35.282422 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:35.282268 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:37.282698 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:37.282652 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:37.283170 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:37.282660 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:37.283170 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:37.282795 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:37.283170 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:37.282903 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:38.886949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:38.886893 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:38.887463 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.887067 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:38.887463 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.887153 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:34:54.887129149 +0000 UTC m=+33.168282772 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:38.987834 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:38.987791 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:38.988052 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.987994 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:38.988052 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.988021 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:38.988052 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.988035 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:38.988200 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:38.988105 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. 
No retries permitted until 2026-04-22 15:34:54.988084685 +0000 UTC m=+33.269238324 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:39.282939 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:39.282898 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:39.282939 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:39.282950 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:39.283197 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:39.283046 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:39.283197 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:39.283132 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:41.282440 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:41.282400 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:41.282990 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:41.282400 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:41.282990 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:41.282560 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:41.282990 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:41.282625 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:42.496436 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.496174 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" event={"ID":"93607452ba047e869102040d23558016","Type":"ContainerStarted","Data":"0a55920e61668a5e56ca970f0176cae6520bed148a389ad1d308fab1e39ad0b6"} Apr 22 15:34:42.500059 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.500023 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"7483cb604b4d387e3e147ea86d43ca014efe239580f642fb0c95221887831e4e"} Apr 22 15:34:42.500182 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.500064 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"9b72fc380204414172384e0bc56c917d53c4cbc48b4c95ea163d88f74b370f10"} Apr 22 15:34:42.500182 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.500078 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"5e49ad7ea2258c42b90f0046703659e712dc0e319034c0f9b95e850434c07993"} Apr 22 15:34:42.502231 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.502198 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vtcgb" event={"ID":"d870e76b-ada6-4b96-8ffa-57ad8f8da412","Type":"ContainerStarted","Data":"874930b938cee5fb06bf0160d425cbf11b5dea56707c10ee43875d0485466255"} Apr 22 15:34:42.504145 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.504045 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" 
event={"ID":"8781a272-57f8-42fd-84df-15814bf56a2a","Type":"ContainerStarted","Data":"9b5df33b0a9b9755a85151b0f423ce9476b1983db8bc4ed5f878f1a46cc665a7"} Apr 22 15:34:42.510036 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.509961 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-86.ec2.internal" podStartSLOduration=19.509944203 podStartE2EDuration="19.509944203s" podCreationTimestamp="2026-04-22 15:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:34:42.509651452 +0000 UTC m=+20.790805090" watchObservedRunningTime="2026-04-22 15:34:42.509944203 +0000 UTC m=+20.791097846" Apr 22 15:34:42.529815 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.529768 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-z7pk7" podStartSLOduration=1.9496148679999998 podStartE2EDuration="20.529749325s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.48949002 +0000 UTC m=+1.770643641" lastFinishedPulling="2026-04-22 15:34:42.069624477 +0000 UTC m=+20.350778098" observedRunningTime="2026-04-22 15:34:42.529671054 +0000 UTC m=+20.810824697" watchObservedRunningTime="2026-04-22 15:34:42.529749325 +0000 UTC m=+20.810902967" Apr 22 15:34:42.548599 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:42.548542 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vtcgb" podStartSLOduration=1.732128049 podStartE2EDuration="20.548505844s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.543098069 +0000 UTC m=+1.824251689" lastFinishedPulling="2026-04-22 15:34:42.359475863 +0000 UTC m=+20.640629484" observedRunningTime="2026-04-22 15:34:42.547790438 +0000 UTC m=+20.828944084" watchObservedRunningTime="2026-04-22 
15:34:42.548505844 +0000 UTC m=+20.829659491" Apr 22 15:34:43.282968 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.282689 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:43.282968 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.282689 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:43.282968 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:43.282955 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:43.283170 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:43.283040 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:43.507464 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.507432 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l947q" event={"ID":"60d1d72d-9ad8-4148-82bc-8ae8873fe4c8","Type":"ContainerStarted","Data":"c74c25e45ac9989b98422a43645ffaa27d317bfc0ab556695058075a891b8dc0"} Apr 22 15:34:43.508906 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.508884 2534 generic.go:358] "Generic (PLEG): container finished" podID="79b905854054fe4837d8eeee581d56c0" containerID="d0f0c2d7b7bad7ebfe3c83b0729485db2a1334066966f6fb532a573c507e5086" exitCode=0 Apr 22 15:34:43.509020 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.508924 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" event={"ID":"79b905854054fe4837d8eeee581d56c0","Type":"ContainerDied","Data":"d0f0c2d7b7bad7ebfe3c83b0729485db2a1334066966f6fb532a573c507e5086"} Apr 22 15:34:43.513619 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.513604 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:34:43.514036 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.514009 2534 generic.go:358] "Generic (PLEG): container finished" podID="92d55e47-0b30-4f98-aff7-3b7325bb3839" containerID="9b72fc380204414172384e0bc56c917d53c4cbc48b4c95ea163d88f74b370f10" exitCode=1 Apr 22 15:34:43.514129 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.514078 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerDied","Data":"9b72fc380204414172384e0bc56c917d53c4cbc48b4c95ea163d88f74b370f10"} Apr 22 15:34:43.514129 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:34:43.514117 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"c0f5e37eb5e725eeb02fbca1cdc00bb1e74fbe460fb40e4bd3e6a49725075014"} Apr 22 15:34:43.514203 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.514131 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"1381c05df4dc2d6460d1cf55a63b05624849222307c527b194ea054fde3e881d"} Apr 22 15:34:43.514203 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.514144 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"408f25b39072c114113d8e1ce3925ffd995678c61791dfbd583506344de50af8"} Apr 22 15:34:43.515316 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.515290 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wbjzt" event={"ID":"1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0","Type":"ContainerStarted","Data":"3a134cf25236ae51657a7a65548814eb5d69c45b9ff16b57d6ed7578ff7b5569"} Apr 22 15:34:43.516482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.516460 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6vjfh" event={"ID":"95598f8a-db85-47f2-859f-d47efcdbfa09","Type":"ContainerStarted","Data":"8bc4d9b4ff70c93067ff2104888eda8962384667f1d73f6e712fcf8a74bed4fa"} Apr 22 15:34:43.517752 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.517720 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" event={"ID":"22adcb4d-4eb7-401a-8a80-b9d872a481d3","Type":"ContainerStarted","Data":"4fb4e84122755462baabe0230c4539aca42869623ca679b5f0c81d8c9bcd64a8"} Apr 22 15:34:43.518935 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:43.518913 2534 generic.go:358] "Generic (PLEG): container finished" podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="d5cd2d38cb22aa8017f71e05a8d0b254ca2c906f7c68d87a421d35412f712e21" exitCode=0 Apr 22 15:34:43.519012 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.518977 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"d5cd2d38cb22aa8017f71e05a8d0b254ca2c906f7c68d87a421d35412f712e21"} Apr 22 15:34:43.523833 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.523790 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l947q" podStartSLOduration=2.979092533 podStartE2EDuration="21.523775195s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.502495621 +0000 UTC m=+1.783649241" lastFinishedPulling="2026-04-22 15:34:42.047178269 +0000 UTC m=+20.328331903" observedRunningTime="2026-04-22 15:34:43.523348376 +0000 UTC m=+21.804502010" watchObservedRunningTime="2026-04-22 15:34:43.523775195 +0000 UTC m=+21.804928836" Apr 22 15:34:43.553115 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.553039 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6vjfh" podStartSLOduration=6.887099427 podStartE2EDuration="21.553021277s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.508158561 +0000 UTC m=+1.789312181" lastFinishedPulling="2026-04-22 15:34:38.174080396 +0000 UTC m=+16.455234031" observedRunningTime="2026-04-22 15:34:43.55069573 +0000 UTC m=+21.831849371" watchObservedRunningTime="2026-04-22 15:34:43.553021277 +0000 UTC m=+21.834174920" Apr 22 15:34:43.569114 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:43.569058 2534 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-dns/node-resolver-wbjzt" podStartSLOduration=6.929670033 podStartE2EDuration="21.569040163s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.534707516 +0000 UTC m=+1.815861136" lastFinishedPulling="2026-04-22 15:34:38.174077639 +0000 UTC m=+16.455231266" observedRunningTime="2026-04-22 15:34:43.568580679 +0000 UTC m=+21.849734322" watchObservedRunningTime="2026-04-22 15:34:43.569040163 +0000 UTC m=+21.850193806" Apr 22 15:34:44.192598 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.192562 2534 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 15:34:44.232017 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.231903 2534 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T15:34:44.192589312Z","UUID":"3aa481eb-ccea-48a5-9311-fb830e776833","Handler":null,"Name":"","Endpoint":""} Apr 22 15:34:44.234389 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.234365 2534 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 15:34:44.234389 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.234394 2534 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 15:34:44.522997 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.522961 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" event={"ID":"79b905854054fe4837d8eeee581d56c0","Type":"ContainerStarted","Data":"fdd243be879f4b112452208474f9856cc2f8b039ddfb001e82902b5212c1bd0e"} Apr 22 15:34:44.524416 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:34:44.524382 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hdcgf" event={"ID":"f42d3546-8a90-41a3-a8b3-01565e6ed78c","Type":"ContainerStarted","Data":"6c37f8bb1c3f4d30049128b1eb45231e4358ba944c7847541af96f2ca60f6a52"} Apr 22 15:34:44.527099 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.526218 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" event={"ID":"22adcb4d-4eb7-401a-8a80-b9d872a481d3","Type":"ContainerStarted","Data":"2ac6196e0ca639dc814209de5a6dfb29f1eb07e115ace110153c5d1dfd85ce53"} Apr 22 15:34:44.538993 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.538945 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-86.ec2.internal" podStartSLOduration=21.538931151 podStartE2EDuration="21.538931151s" podCreationTimestamp="2026-04-22 15:34:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:34:44.538427021 +0000 UTC m=+22.819580666" watchObservedRunningTime="2026-04-22 15:34:44.538931151 +0000 UTC m=+22.820084793" Apr 22 15:34:44.554024 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:44.553972 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hdcgf" podStartSLOduration=4.055743709 podStartE2EDuration="22.553956465s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.548967183 +0000 UTC m=+1.830120803" lastFinishedPulling="2026-04-22 15:34:42.047179924 +0000 UTC m=+20.328333559" observedRunningTime="2026-04-22 15:34:44.553740675 +0000 UTC m=+22.834894318" watchObservedRunningTime="2026-04-22 15:34:44.553956465 +0000 UTC m=+22.835110106" Apr 22 15:34:45.158138 ip-10-0-130-86 kubenswrapper[2534]: 
I0422 15:34:45.158099 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:45.158875 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.158859 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:45.282560 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.282447 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:45.282560 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.282444 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:45.282768 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:45.282597 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:45.282768 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:45.282649 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:45.531145 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.531113 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:34:45.531584 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.531549 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"f70bb0a4df6744a8bd5c116ffe01706fa9c5d1f1fb610f972cf9c7c4a44cd17c"} Apr 22 15:34:45.533889 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.533809 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" event={"ID":"22adcb4d-4eb7-401a-8a80-b9d872a481d3","Type":"ContainerStarted","Data":"b3b95e4c6e65e52934cbd9c8989b8d48bf6d7987429c8d4c4c0a2692e1c48686"} Apr 22 15:34:45.534051 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.534034 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:45.534734 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.534711 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6vjfh" Apr 22 15:34:45.554925 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:45.554871 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mrcgh" podStartSLOduration=2.052705916 podStartE2EDuration="23.554850439s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.471991498 +0000 UTC m=+1.753145118" lastFinishedPulling="2026-04-22 15:34:44.974136019 +0000 UTC m=+23.255289641" observedRunningTime="2026-04-22 15:34:45.554420872 
+0000 UTC m=+23.835574513" watchObservedRunningTime="2026-04-22 15:34:45.554850439 +0000 UTC m=+23.836004081" Apr 22 15:34:47.282041 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:47.282000 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:47.282466 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:47.282000 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:47.282466 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:47.282161 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:47.282466 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:47.282204 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:48.542927 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.542751 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:34:48.543655 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.543165 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"f46caa07490d229730553514196c682f721a20fcdf7d95f325f3f10e29ebf7d0"} Apr 22 15:34:48.543655 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.543489 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:48.543655 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.543516 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:48.543774 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.543660 2534 scope.go:117] "RemoveContainer" containerID="9b72fc380204414172384e0bc56c917d53c4cbc48b4c95ea163d88f74b370f10" Apr 22 15:34:48.559605 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:48.559581 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:49.282197 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.282161 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:49.282409 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.282161 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:49.282409 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:49.282277 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:49.282409 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:49.282361 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:49.550389 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.550307 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:34:49.550866 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.550737 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" event={"ID":"92d55e47-0b30-4f98-aff7-3b7325bb3839","Type":"ContainerStarted","Data":"2cebede8739ce3d8ee128587069fa1db627b6278bc289fbb018be7728fb1693e"} Apr 22 15:34:49.551962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.551933 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:49.554517 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.554478 2534 generic.go:358] "Generic (PLEG): container finished" 
podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="057d6f4c85bc911749ed7e94ffc13903c1fd7c2985e6c006731582e8d481b7e0" exitCode=0 Apr 22 15:34:49.554720 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.554518 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"057d6f4c85bc911749ed7e94ffc13903c1fd7c2985e6c006731582e8d481b7e0"} Apr 22 15:34:49.574733 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.574515 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" Apr 22 15:34:49.613488 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:49.613431 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl" podStartSLOduration=9.069044831 podStartE2EDuration="27.613416621s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.554886575 +0000 UTC m=+1.836040199" lastFinishedPulling="2026-04-22 15:34:42.099258369 +0000 UTC m=+20.380411989" observedRunningTime="2026-04-22 15:34:49.611740755 +0000 UTC m=+27.892894420" watchObservedRunningTime="2026-04-22 15:34:49.613416621 +0000 UTC m=+27.894570263" Apr 22 15:34:50.247254 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.247219 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-82tqk"] Apr 22 15:34:50.247476 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.247373 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:50.247564 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:50.247491 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:50.247756 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.247719 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h8d7v"] Apr 22 15:34:50.247904 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.247842 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:50.248166 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:50.247941 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:50.558433 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.558402 2534 generic.go:358] "Generic (PLEG): container finished" podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="af8247ee6a91778f5299c8b5d8bf9488d9d9daea72f0384639da01411654a2ef" exitCode=0 Apr 22 15:34:50.558898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:50.558473 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"af8247ee6a91778f5299c8b5d8bf9488d9d9daea72f0384639da01411654a2ef"} Apr 22 15:34:51.566428 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:51.566390 2534 generic.go:358] "Generic (PLEG): container finished" podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="4fbe9659e9a21d0ddbb407f6e25fe982b9f2243bf3f86073452a611a0ab8cf95" exitCode=0 Apr 22 15:34:51.566911 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:51.566487 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"4fbe9659e9a21d0ddbb407f6e25fe982b9f2243bf3f86073452a611a0ab8cf95"} Apr 22 15:34:52.284029 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:52.283807 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:52.284213 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:52.284115 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:52.284482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:52.283881 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:52.284606 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:52.284588 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:54.281883 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:54.281851 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:54.282281 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:54.281854 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:54.282281 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:54.281988 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-h8d7v" podUID="7febb925-9a97-4316-9acd-71100c471eb1" Apr 22 15:34:54.282281 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:54.282065 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-82tqk" podUID="c2e72f84-cb38-472e-abba-c2f44adaf2fd" Apr 22 15:34:54.906743 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:54.906707 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:34:54.906918 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:54.906818 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:54.906918 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:54.906870 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.906854423 +0000 UTC m=+65.188008043 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 15:34:55.004111 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.004085 2534 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-86.ec2.internal" event="NodeReady" Apr 22 15:34:55.004302 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.004218 2534 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 15:34:55.007435 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.007409 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:34:55.007585 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.007571 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 15:34:55.007637 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.007591 2534 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 15:34:55.007637 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.007600 2534 projected.go:194] Error preparing data for projected volume kube-api-access-xqz8g for pod openshift-network-diagnostics/network-check-target-h8d7v: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 22 15:34:55.007715 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.007644 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g podName:7febb925-9a97-4316-9acd-71100c471eb1 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:27.007632652 +0000 UTC m=+65.288786272 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-xqz8g" (UniqueName: "kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g") pod "network-check-target-h8d7v" (UID: "7febb925-9a97-4316-9acd-71100c471eb1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 15:34:55.057182 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.057141 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-cqvj8"] Apr 22 15:34:55.077824 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.077779 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-fshk9"] Apr 22 15:34:55.078146 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.078119 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.081154 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.081122 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 15:34:55.081296 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.081159 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\"" Apr 22 15:34:55.086926 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.086901 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 15:34:55.087155 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.087137 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fshk9"] Apr 22 15:34:55.087220 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.087161 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqvj8"] Apr 22 15:34:55.087274 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.087261 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:34:55.089868 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.089843 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 15:34:55.089868 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.089866 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\"" Apr 22 15:34:55.090063 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.089883 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 15:34:55.090063 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.089866 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 15:34:55.208767 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.208724 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6srxd\" (UniqueName: \"kubernetes.io/projected/aec47240-1952-48af-b917-7fdd3074710a-kube-api-access-6srxd\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.208975 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.208868 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aec47240-1952-48af-b917-7fdd3074710a-tmp-dir\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.208975 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.208937 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.209087 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.209000 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:34:55.209087 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.209017 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aec47240-1952-48af-b917-7fdd3074710a-config-volume\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.209087 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.209040 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jhxk\" (UniqueName: \"kubernetes.io/projected/b91d8a6d-a426-4263-ae63-99ecd3ff6949-kube-api-access-5jhxk\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:34:55.309410 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309372 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aec47240-1952-48af-b917-7fdd3074710a-tmp-dir\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:34:55.309410 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309415 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309462 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309486 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aec47240-1952-48af-b917-7fdd3074710a-config-volume\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309511 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jhxk\" (UniqueName: \"kubernetes.io/projected/b91d8a6d-a426-4263-ae63-99ecd3ff6949-kube-api-access-5jhxk\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309565 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6srxd\" (UniqueName: \"kubernetes.io/projected/aec47240-1952-48af-b917-7fdd3074710a-kube-api-access-6srxd\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.309652 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.309739 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:55.809715638 +0000 UTC m=+34.090869277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.309651 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.309787 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aec47240-1952-48af-b917-7fdd3074710a-tmp-dir\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.310101 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.309844 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:55.809822653 +0000 UTC m=+34.090976273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:34:55.310419 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.310202 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/aec47240-1952-48af-b917-7fdd3074710a-config-volume\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.324137 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.324105 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6srxd\" (UniqueName: \"kubernetes.io/projected/aec47240-1952-48af-b917-7fdd3074710a-kube-api-access-6srxd\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.324137 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.324122 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jhxk\" (UniqueName: \"kubernetes.io/projected/b91d8a6d-a426-4263-ae63-99ecd3ff6949-kube-api-access-5jhxk\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:55.814224 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.814181 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:55.814488 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:55.814251 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:55.814488 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.814370 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:34:55.814488 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.814382 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:34:55.814488 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.814438 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:56.81442472 +0000 UTC m=+35.095578340 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:55.814488 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:55.814454 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:56.814446607 +0000 UTC m=+35.095600227 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:34:56.282247 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.282207 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:34:56.282440 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.282253 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:34:56.286421 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.286260 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xnsfv\""
Apr 22 15:34:56.286421 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.286259 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 15:34:56.286668 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.286260 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\""
Apr 22 15:34:56.286668 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.286269 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:34:56.286668 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.286269 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 15:34:56.823102 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.823061 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:56.823857 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:56.823159 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:56.823857 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:56.823233 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:34:56.823857 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:56.823304 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:34:56.823857 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:56.823321 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:34:58.823299619 +0000 UTC m=+37.104453253 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:34:56.823857 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:56.823353 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:34:58.823337824 +0000 UTC m=+37.104491446 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:58.582188 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:58.582150 2534 generic.go:358] "Generic (PLEG): container finished" podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="486b705749a7c69f2710b609f54f7e6f5c51860677aa903eda8f549914862f21" exitCode=0
Apr 22 15:34:58.582747 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:58.582198 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"486b705749a7c69f2710b609f54f7e6f5c51860677aa903eda8f549914862f21"}
Apr 22 15:34:58.839843 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:58.839751 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:34:58.839843 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:58.839817 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:34:58.840051 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:58.839897 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:34:58.840051 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:58.839920 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:34:58.840051 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:58.839963 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:35:02.839948779 +0000 UTC m=+41.121102398 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:34:58.840051 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:34:58.839978 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:02.839972474 +0000 UTC m=+41.121126094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:34:59.587255 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:59.587220 2534 generic.go:358] "Generic (PLEG): container finished" podID="ca5aa4d2-6513-4631-aeb7-c9120934e117" containerID="84ba3e3b01ef52ccb9723cd88210be4e9d5eefb18599c9da9bfac9ae4e53d6a7" exitCode=0
Apr 22 15:34:59.587675 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:34:59.587272 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerDied","Data":"84ba3e3b01ef52ccb9723cd88210be4e9d5eefb18599c9da9bfac9ae4e53d6a7"}
Apr 22 15:35:00.592057 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:00.591777 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" event={"ID":"ca5aa4d2-6513-4631-aeb7-c9120934e117","Type":"ContainerStarted","Data":"8ba55c1f2e4bd481963dd74e704184a14e99be2294e4a591ed31fa9738a129f7"}
Apr 22 15:35:00.615735 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:00.615689 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7b4hc" podStartSLOduration=4.570120342 podStartE2EDuration="38.615673221s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:34:23.540433421 +0000 UTC m=+1.821587056" lastFinishedPulling="2026-04-22 15:34:57.5859863 +0000 UTC m=+35.867139935" observedRunningTime="2026-04-22 15:35:00.61447009 +0000 UTC m=+38.895623729" watchObservedRunningTime="2026-04-22 15:35:00.615673221 +0000 UTC m=+38.896826862"
Apr 22 15:35:02.867441 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:02.867389 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:35:02.867923 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:02.867567 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:35:02.867923 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:02.867627 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:35:02.867923 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:02.867638 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:10.867622043 +0000 UTC m=+49.148775663 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:35:02.867923 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:02.867683 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:35:10.867669353 +0000 UTC m=+49.148822972 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:35:02.868198 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:02.867485 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:35:10.923691 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:10.923648 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:35:10.924076 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:10.923711 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:35:10.924076 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:10.923794 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:35:10.924076 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:10.923801 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:35:10.924076 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:10.923846 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.923830698 +0000 UTC m=+65.204984318 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:35:10.924076 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:10.923859 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:26.923852942 +0000 UTC m=+65.205006562 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:35:21.577990 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:21.577963 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fpdtl"
Apr 22 15:35:26.930050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:26.930002 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8"
Apr 22 15:35:26.930050 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:26.930051 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk"
Apr 22 15:35:26.930612 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:26.930079 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9"
Apr 22 15:35:26.930612 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.930155 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 15:35:26.930612 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.930158 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 15:35:26.930612 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.930219 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.9302047 +0000 UTC m=+97.211358323 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found
Apr 22 15:35:26.930612 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.930233 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.930226381 +0000 UTC m=+97.211380001 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found
Apr 22 15:35:26.932969 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:26.932946 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 15:35:26.940708 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.940675 2534 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 15:35:26.940843 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:26.940777 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs podName:c2e72f84-cb38-472e-abba-c2f44adaf2fd nodeName:}" failed. No retries permitted until 2026-04-22 15:36:30.94075699 +0000 UTC m=+129.221910611 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs") pod "network-metrics-daemon-82tqk" (UID: "c2e72f84-cb38-472e-abba-c2f44adaf2fd") : secret "metrics-daemon-secret" not found
Apr 22 15:35:27.031391 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.031355 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:35:27.034617 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.034585 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 15:35:27.044625 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.044582 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 15:35:27.055783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.055751 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqz8g\" (UniqueName: \"kubernetes.io/projected/7febb925-9a97-4316-9acd-71100c471eb1-kube-api-access-xqz8g\") pod \"network-check-target-h8d7v\" (UID: \"7febb925-9a97-4316-9acd-71100c471eb1\") " pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:35:27.203865 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.203782 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-xnsfv\""
Apr 22 15:35:27.211029 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.211007 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:35:27.392221 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.392188 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-h8d7v"]
Apr 22 15:35:27.395705 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:35:27.395676 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7febb925_9a97_4316_9acd_71100c471eb1.slice/crio-85148df0139ac9fddaab1cea8eb11c7f8ddf3273bc8a1d3e076ff8fe6269d838 WatchSource:0}: Error finding container 85148df0139ac9fddaab1cea8eb11c7f8ddf3273bc8a1d3e076ff8fe6269d838: Status 404 returned error can't find the container with id 85148df0139ac9fddaab1cea8eb11c7f8ddf3273bc8a1d3e076ff8fe6269d838
Apr 22 15:35:27.643063 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:27.642976 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h8d7v" event={"ID":"7febb925-9a97-4316-9acd-71100c471eb1","Type":"ContainerStarted","Data":"85148df0139ac9fddaab1cea8eb11c7f8ddf3273bc8a1d3e076ff8fe6269d838"}
Apr 22 15:35:30.650631 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:30.650535 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-h8d7v" event={"ID":"7febb925-9a97-4316-9acd-71100c471eb1","Type":"ContainerStarted","Data":"dc42fbfd487b0cd9f1cafa78e2bbba743693270a8b24354834b64077dabdd022"}
Apr 22 15:35:30.651016 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:30.650662 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-h8d7v"
Apr 22 15:35:30.670114 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:30.670064 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-h8d7v" podStartSLOduration=65.757724096 podStartE2EDuration="1m8.670050324s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:35:27.397497072 +0000 UTC m=+65.678650696" lastFinishedPulling="2026-04-22 15:35:30.309823304 +0000 UTC m=+68.590976924" observedRunningTime="2026-04-22 15:35:30.669420487 +0000 UTC m=+68.950574128" watchObservedRunningTime="2026-04-22 15:35:30.670050324 +0000 UTC m=+68.951203991"
Apr 22 15:35:54.621157 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.621033 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d45rg"]
Apr 22 15:35:54.625066 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.625040 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.627775 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.627751 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\""
Apr 22 15:35:54.627949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.627799 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\""
Apr 22 15:35:54.627949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.627801 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 15:35:54.627949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.627882 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 15:35:54.627949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.627803 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-wx2vb\""
Apr 22 15:35:54.632832 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.632812 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\""
Apr 22 15:35:54.635455 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.635428 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d45rg"]
Apr 22 15:35:54.719014 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.718977 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"]
Apr 22 15:35:54.721472 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721448 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-tmp\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.721601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721488 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.721743 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721714 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-snapshots\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.721868 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721810 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"
Apr 22 15:35:54.721868 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721809 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btggc\" (UniqueName: \"kubernetes.io/projected/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-kube-api-access-btggc\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.721975 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721935 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-service-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.721975 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.721960 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-serving-cert\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.722656 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.722637 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-779cbd5446-jz78p"]
Apr 22 15:35:54.724606 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.724584 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 22 15:35:54.724758 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.724737 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 15:35:54.724811 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.724755 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 15:35:54.724811 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.724765 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 22 15:35:54.724811 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.724756 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5lb5p\""
Apr 22 15:35:54.725259 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.725245 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-779cbd5446-jz78p"
Apr 22 15:35:54.727721 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727689 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 22 15:35:54.727837 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727728 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kdqwq\""
Apr 22 15:35:54.727837 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727808 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 22 15:35:54.727938 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727811 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 22 15:35:54.727989 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727811 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 22 15:35:54.728040 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.727988 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 15:35:54.728040 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.728000 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 15:35:54.729549 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.729513 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"]
Apr 22 15:35:54.736122 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.736028 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-779cbd5446-jz78p"]
Apr 22 15:35:54.822312 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822268 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-tmp\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.822312 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822315 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.822627 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822507 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7575w\" (UniqueName: \"kubernetes.io/projected/7d4bedbf-e56a-428e-be2d-5f1a5111951f-kube-api-access-7575w\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"
Apr 22 15:35:54.822627 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822616 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"
Apr 22 15:35:54.822706 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822657 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-snapshots\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg"
Apr 22 15:35:54.822706 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822688 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-stats-auth\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p"
Apr 22 15:35:54.822782 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822730 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-tmp\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") "
pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.822782 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822747 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-default-certificate\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.822840 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822786 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9tp6\" (UniqueName: \"kubernetes.io/projected/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-kube-api-access-l9tp6\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.822840 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822805 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.822840 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822831 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btggc\" (UniqueName: \"kubernetes.io/projected/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-kube-api-access-btggc\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.822929 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822870 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-service-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.822929 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822885 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-serving-cert\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.822929 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822901 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.822929 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.822918 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d4bedbf-e56a-428e-be2d-5f1a5111951f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.823283 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.823261 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-snapshots\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " 
pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.823507 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.823486 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-service-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.823825 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.823804 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.825337 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.825316 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-serving-cert\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.831884 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.831861 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btggc\" (UniqueName: \"kubernetes.io/projected/4269d0dd-d09b-4927-96de-1b3ab59b5ec7-kube-api-access-btggc\") pod \"insights-operator-585dfdc468-d45rg\" (UID: \"4269d0dd-d09b-4927-96de-1b3ab59b5ec7\") " pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:54.923956 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.923917 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7575w\" (UniqueName: 
\"kubernetes.io/projected/7d4bedbf-e56a-428e-be2d-5f1a5111951f-kube-api-access-7575w\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.924135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.923977 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.924135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924016 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-stats-auth\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.924135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924084 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-default-certificate\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.924135 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924120 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9tp6\" (UniqueName: \"kubernetes.io/projected/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-kube-api-access-l9tp6\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " 
pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.924341 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924151 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.924341 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924222 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.924341 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.924252 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d4bedbf-e56a-428e-be2d-5f1a5111951f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.924341 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:54.924330 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:35:54.924553 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:54.924367 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:54.924553 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:54.924407 2534 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:55.424381393 +0000 UTC m=+93.705535027 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:35:54.924553 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:54.924435 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. No retries permitted until 2026-04-22 15:35:55.424419015 +0000 UTC m=+93.705572635 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:54.924553 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:54.924454 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:55.424444556 +0000 UTC m=+93.705598176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:35:54.925044 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.925022 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/7d4bedbf-e56a-428e-be2d-5f1a5111951f-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.927071 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.927048 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-stats-auth\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.927156 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.927111 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-default-certificate\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.933382 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.933350 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7575w\" (UniqueName: \"kubernetes.io/projected/7d4bedbf-e56a-428e-be2d-5f1a5111951f-kube-api-access-7575w\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:54.933475 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.933410 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9tp6\" (UniqueName: \"kubernetes.io/projected/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-kube-api-access-l9tp6\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:54.934562 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:54.934548 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-d45rg" Apr 22 15:35:55.057168 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:55.057126 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-d45rg"] Apr 22 15:35:55.060033 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:35:55.060006 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4269d0dd_d09b_4927_96de_1b3ab59b5ec7.slice/crio-39a2c35312f684455e92eb92aa30f718360cd956c453acda18ae57309d2f3dfa WatchSource:0}: Error finding container 39a2c35312f684455e92eb92aa30f718360cd956c453acda18ae57309d2f3dfa: Status 404 returned error can't find the container with id 39a2c35312f684455e92eb92aa30f718360cd956c453acda18ae57309d2f3dfa Apr 22 15:35:55.428718 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:55.428684 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:55.428888 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:55.428742 2534 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:55.428888 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:55.428810 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:55.428888 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:55.428860 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:56.428840117 +0000 UTC m=+94.709993757 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:35:55.429047 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:55.428907 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:35:55.429047 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:55.428907 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:55.429047 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:55.428958 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:56.428949126 +0000 UTC m=+94.710102745 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:35:55.429047 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:55.428971 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. No retries permitted until 2026-04-22 15:35:56.42896485 +0000 UTC m=+94.710118470 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:55.697788 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:55.697699 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d45rg" event={"ID":"4269d0dd-d09b-4927-96de-1b3ab59b5ec7","Type":"ContainerStarted","Data":"39a2c35312f684455e92eb92aa30f718360cd956c453acda18ae57309d2f3dfa"} Apr 22 15:35:56.437076 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:56.437031 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:56.437095 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:56.437139 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " 
pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:56.437206 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:56.437247 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.437232636 +0000 UTC m=+96.718386256 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:56.437254 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:56.437271 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.437254588 +0000 UTC m=+96.718408210 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:56.437306 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:56.437302 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:35:58.437287631 +0000 UTC m=+96.718441251 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:35:58.457256 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.457212 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.457279 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.457325 2534 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.457370 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.457440 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:02.457422823 +0000 UTC m=+100.738576464 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.457446 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.457456 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. No retries permitted until 2026-04-22 15:36:02.457449282 +0000 UTC m=+100.738602901 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:35:58.457725 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.457507 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:02.457485085 +0000 UTC m=+100.738638713 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:35:58.705654 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.705616 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d45rg" event={"ID":"4269d0dd-d09b-4927-96de-1b3ab59b5ec7","Type":"ContainerStarted","Data":"20eed9b73a36b05843e6abb2a9bb5f27fb5edd4a43a31be6c97401016260e5de"} Apr 22 15:35:58.722512 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.722409 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-d45rg" podStartSLOduration=2.036270932 podStartE2EDuration="4.722393765s" podCreationTimestamp="2026-04-22 15:35:54 +0000 UTC" firstStartedPulling="2026-04-22 15:35:55.06189665 +0000 UTC m=+93.343050274" lastFinishedPulling="2026-04-22 15:35:57.748019473 +0000 UTC m=+96.029173107" observedRunningTime="2026-04-22 15:35:58.721998791 +0000 UTC m=+97.003152436" watchObservedRunningTime="2026-04-22 15:35:58.722393765 +0000 UTC 
m=+97.003547428" Apr 22 15:35:58.962499 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.962464 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:35:58.962724 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:35:58.962545 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:35:58.962724 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.962632 2534 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 15:35:58.962724 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.962643 2534 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 15:35:58.962724 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.962693 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert podName:b91d8a6d-a426-4263-ae63-99ecd3ff6949 nodeName:}" failed. No retries permitted until 2026-04-22 15:37:02.962680022 +0000 UTC m=+161.243833642 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert") pod "ingress-canary-fshk9" (UID: "b91d8a6d-a426-4263-ae63-99ecd3ff6949") : secret "canary-serving-cert" not found Apr 22 15:35:58.962724 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:35:58.962706 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls podName:aec47240-1952-48af-b917-7fdd3074710a nodeName:}" failed. No retries permitted until 2026-04-22 15:37:02.962699395 +0000 UTC m=+161.243853015 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls") pod "dns-default-cqvj8" (UID: "aec47240-1952-48af-b917-7fdd3074710a") : secret "dns-default-metrics-tls" not found Apr 22 15:36:01.425784 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:01.425755 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wbjzt_1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0/dns-node-resolver/0.log" Apr 22 15:36:01.655446 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:01.655408 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-h8d7v" Apr 22 15:36:02.026141 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:02.026121 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l947q_60d1d72d-9ad8-4148-82bc-8ae8873fe4c8/node-ca/0.log" Apr 22 15:36:02.493494 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:02.493450 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " 
pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:02.493552 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:02.493592 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:02.493629 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:02.493706 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:02.493715 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:10.49369615 +0000 UTC m=+108.774849784 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:02.493732 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:10.493725087 +0000 UTC m=+108.774878706 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:36:02.493959 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:02.493767 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. No retries permitted until 2026-04-22 15:36:10.493750161 +0000 UTC m=+108.774903799 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:36:04.623981 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.623944 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9"] Apr 22 15:36:04.630277 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.630241 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.633755 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.633729 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 22 15:36:04.638707 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.638684 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 22 15:36:04.639328 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.639309 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 22 15:36:04.639657 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.639639 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 22 15:36:04.639811 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.639792 2534 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-vxrb7\"" Apr 22 15:36:04.648014 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.647987 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9"] Apr 22 15:36:04.709354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.709303 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.709584 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.709376 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnl8k\" (UniqueName: \"kubernetes.io/projected/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-kube-api-access-mnl8k\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.709584 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.709475 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.810051 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:36:04.810009 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.810051 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.810060 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.810314 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.810111 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mnl8k\" (UniqueName: \"kubernetes.io/projected/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-kube-api-access-mnl8k\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.810782 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.810756 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.812586 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.812557 2534 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.820267 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.820227 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnl8k\" (UniqueName: \"kubernetes.io/projected/6fc37a4f-3b0b-435c-a8c6-ceab191ad796-kube-api-access-mnl8k\") pod \"kube-storage-version-migrator-operator-6769c5d45-gp7j9\" (UID: \"6fc37a4f-3b0b-435c-a8c6-ceab191ad796\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:04.939304 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:04.939271 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" Apr 22 15:36:05.069943 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:05.069910 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9"] Apr 22 15:36:05.073899 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:05.073872 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fc37a4f_3b0b_435c_a8c6_ceab191ad796.slice/crio-4739f5fb823296aa701cd953f643120690e649dd0dd83e7570cd5ba06cec7f2f WatchSource:0}: Error finding container 4739f5fb823296aa701cd953f643120690e649dd0dd83e7570cd5ba06cec7f2f: Status 404 returned error can't find the container with id 4739f5fb823296aa701cd953f643120690e649dd0dd83e7570cd5ba06cec7f2f Apr 22 15:36:05.721770 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:05.721732 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" event={"ID":"6fc37a4f-3b0b-435c-a8c6-ceab191ad796","Type":"ContainerStarted","Data":"4739f5fb823296aa701cd953f643120690e649dd0dd83e7570cd5ba06cec7f2f"} Apr 22 15:36:07.727345 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:07.727260 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" event={"ID":"6fc37a4f-3b0b-435c-a8c6-ceab191ad796","Type":"ContainerStarted","Data":"5637f15d0dc83cf6b22a30eba023801c3a3a14de14210e52f9d40e9bd3fa7c7f"} Apr 22 15:36:07.746056 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:07.745997 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" 
podStartSLOduration=1.365255935 podStartE2EDuration="3.745981389s" podCreationTimestamp="2026-04-22 15:36:04 +0000 UTC" firstStartedPulling="2026-04-22 15:36:05.075479625 +0000 UTC m=+103.356633245" lastFinishedPulling="2026-04-22 15:36:07.456205076 +0000 UTC m=+105.737358699" observedRunningTime="2026-04-22 15:36:07.745250289 +0000 UTC m=+106.026403931" watchObservedRunningTime="2026-04-22 15:36:07.745981389 +0000 UTC m=+106.027135030" Apr 22 15:36:08.752833 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.752800 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q"] Apr 22 15:36:08.755981 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.755962 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" Apr 22 15:36:08.759799 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.759776 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 15:36:08.760493 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.760473 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 15:36:08.771252 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.771221 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-hlzzh\"" Apr 22 15:36:08.797643 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.797609 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q"] Apr 22 15:36:08.845370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.845328 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jppsk\" (UniqueName: \"kubernetes.io/projected/3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9-kube-api-access-jppsk\") pod \"migrator-74bb7799d9-zfv4q\" (UID: \"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" Apr 22 15:36:08.946643 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.946584 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jppsk\" (UniqueName: \"kubernetes.io/projected/3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9-kube-api-access-jppsk\") pod \"migrator-74bb7799d9-zfv4q\" (UID: \"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" Apr 22 15:36:08.956013 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:08.955975 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jppsk\" (UniqueName: \"kubernetes.io/projected/3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9-kube-api-access-jppsk\") pod \"migrator-74bb7799d9-zfv4q\" (UID: \"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" Apr 22 15:36:09.064967 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:09.064863 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" Apr 22 15:36:09.192106 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:09.192066 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q"] Apr 22 15:36:09.195135 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:09.195097 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea1ff9e_4077_42b1_a6ea_bfe4d96c42e9.slice/crio-300dffdd64bb18b21be0969c929d8b2cc526a471c2a041bdc089a9ccd9746f2f WatchSource:0}: Error finding container 300dffdd64bb18b21be0969c929d8b2cc526a471c2a041bdc089a9ccd9746f2f: Status 404 returned error can't find the container with id 300dffdd64bb18b21be0969c929d8b2cc526a471c2a041bdc089a9ccd9746f2f Apr 22 15:36:09.732191 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:09.732140 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" event={"ID":"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9","Type":"ContainerStarted","Data":"300dffdd64bb18b21be0969c929d8b2cc526a471c2a041bdc089a9ccd9746f2f"} Apr 22 15:36:10.560330 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:10.560289 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:10.560393 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: 
\"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:10.560437 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:10.560449 2534 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:10.560540 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:26.560503971 +0000 UTC m=+124.841657594 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : secret "router-metrics-certs-default" not found Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:10.560543 2534 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:10.560592 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls podName:7d4bedbf-e56a-428e-be2d-5f1a5111951f nodeName:}" failed. 
No retries permitted until 2026-04-22 15:36:26.560580077 +0000 UTC m=+124.841733697 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-qdtnd" (UID: "7d4bedbf-e56a-428e-be2d-5f1a5111951f") : secret "cluster-monitoring-operator-tls" not found Apr 22 15:36:10.560881 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:10.560629 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle podName:e1977e29-cbfa-4dfa-910e-a7f5d173c2e9 nodeName:}" failed. No retries permitted until 2026-04-22 15:36:26.560605922 +0000 UTC m=+124.841759549 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle") pod "router-default-779cbd5446-jz78p" (UID: "e1977e29-cbfa-4dfa-910e-a7f5d173c2e9") : configmap references non-existent config key: service-ca.crt Apr 22 15:36:11.738332 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:11.738299 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" event={"ID":"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9","Type":"ContainerStarted","Data":"1bb039c56dd686ba2c37e9ada6f611809753f86d2f2ad82d1528d7e41a3422f5"} Apr 22 15:36:11.738332 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:11.738337 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" event={"ID":"3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9","Type":"ContainerStarted","Data":"6f2b6c7b757a3b69b3c94521f37aca1c6c79e5ae0893eea72be083dd37084393"} Apr 22 15:36:11.759108 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:11.759052 2534 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-zfv4q" podStartSLOduration=2.267127237 podStartE2EDuration="3.759033103s" podCreationTimestamp="2026-04-22 15:36:08 +0000 UTC" firstStartedPulling="2026-04-22 15:36:09.197080882 +0000 UTC m=+107.478234519" lastFinishedPulling="2026-04-22 15:36:10.688986766 +0000 UTC m=+108.970140385" observedRunningTime="2026-04-22 15:36:11.758078208 +0000 UTC m=+110.039231852" watchObservedRunningTime="2026-04-22 15:36:11.759033103 +0000 UTC m=+110.040186744" Apr 22 15:36:26.591098 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.591054 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:26.591098 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.591106 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:36:26.591665 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.591236 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:26.591713 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.591692 2534 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-service-ca-bundle\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:26.593701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.593677 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e1977e29-cbfa-4dfa-910e-a7f5d173c2e9-metrics-certs\") pod \"router-default-779cbd5446-jz78p\" (UID: \"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9\") " pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:26.593701 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.593696 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4bedbf-e56a-428e-be2d-5f1a5111951f-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-qdtnd\" (UID: \"7d4bedbf-e56a-428e-be2d-5f1a5111951f\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:36:26.836752 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.836718 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-5lb5p\"" Apr 22 15:36:26.842664 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.842604 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-kdqwq\"" Apr 22 15:36:26.844658 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.844638 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" Apr 22 15:36:26.849077 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.849044 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:26.979157 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.979121 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd"] Apr 22 15:36:26.982089 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:26.982063 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4bedbf_e56a_428e_be2d_5f1a5111951f.slice/crio-063ffdc55abe1cf730618615e5d3445943ccbf36f83698267d1b8d8e84dce398 WatchSource:0}: Error finding container 063ffdc55abe1cf730618615e5d3445943ccbf36f83698267d1b8d8e84dce398: Status 404 returned error can't find the container with id 063ffdc55abe1cf730618615e5d3445943ccbf36f83698267d1b8d8e84dce398 Apr 22 15:36:26.998174 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:26.998147 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-779cbd5446-jz78p"] Apr 22 15:36:27.001331 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:27.001304 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1977e29_cbfa_4dfa_910e_a7f5d173c2e9.slice/crio-c0c01df868e7c567267ca897c72460442c06157af1354f0af8c1cea29389a407 WatchSource:0}: Error finding container c0c01df868e7c567267ca897c72460442c06157af1354f0af8c1cea29389a407: Status 404 returned error can't find the container with id c0c01df868e7c567267ca897c72460442c06157af1354f0af8c1cea29389a407 Apr 22 15:36:27.777222 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:27.777183 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-779cbd5446-jz78p" event={"ID":"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9","Type":"ContainerStarted","Data":"7422de8520c20fe965accd77dcd7e146027eced49717f5f43b4c5dc553a69351"} Apr 22 15:36:27.777222 ip-10-0-130-86 kubenswrapper[2534]: 
I0422 15:36:27.777224 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-779cbd5446-jz78p" event={"ID":"e1977e29-cbfa-4dfa-910e-a7f5d173c2e9","Type":"ContainerStarted","Data":"c0c01df868e7c567267ca897c72460442c06157af1354f0af8c1cea29389a407"} Apr 22 15:36:27.778132 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:27.778111 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" event={"ID":"7d4bedbf-e56a-428e-be2d-5f1a5111951f","Type":"ContainerStarted","Data":"063ffdc55abe1cf730618615e5d3445943ccbf36f83698267d1b8d8e84dce398"} Apr 22 15:36:27.797506 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:27.797446 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-779cbd5446-jz78p" podStartSLOduration=33.797427517 podStartE2EDuration="33.797427517s" podCreationTimestamp="2026-04-22 15:35:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:36:27.795923865 +0000 UTC m=+126.077077506" watchObservedRunningTime="2026-04-22 15:36:27.797427517 +0000 UTC m=+126.078581159" Apr 22 15:36:27.850114 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:27.850075 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:27.852677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:27.852652 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:28.781063 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:28.781033 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:28.782366 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:28.782344 2534 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-779cbd5446-jz78p" Apr 22 15:36:29.785029 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:29.784987 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" event={"ID":"7d4bedbf-e56a-428e-be2d-5f1a5111951f","Type":"ContainerStarted","Data":"e15cd8286d85505d42e4af507148757a5a00a39fb9823038c2b6d49e56515c11"} Apr 22 15:36:29.827304 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:29.827239 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-qdtnd" podStartSLOduration=33.509485541 podStartE2EDuration="35.82722001s" podCreationTimestamp="2026-04-22 15:35:54 +0000 UTC" firstStartedPulling="2026-04-22 15:36:26.983835891 +0000 UTC m=+125.264989515" lastFinishedPulling="2026-04-22 15:36:29.301570351 +0000 UTC m=+127.582723984" observedRunningTime="2026-04-22 15:36:29.826232211 +0000 UTC m=+128.107385852" watchObservedRunningTime="2026-04-22 15:36:29.82722001 +0000 UTC m=+128.108373653" Apr 22 15:36:31.030537 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:31.030488 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:36:31.033022 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:31.032996 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c2e72f84-cb38-472e-abba-c2f44adaf2fd-metrics-certs\") pod \"network-metrics-daemon-82tqk\" (UID: \"c2e72f84-cb38-472e-abba-c2f44adaf2fd\") " pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:36:31.096915 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:36:31.096882 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-qxbwk\"" Apr 22 15:36:31.105107 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:31.105082 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-82tqk" Apr 22 15:36:31.226046 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:31.226008 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-82tqk"] Apr 22 15:36:31.229021 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:31.228988 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2e72f84_cb38_472e_abba_c2f44adaf2fd.slice/crio-1718090e759c1ad8466f32f4f6ca0841a13a1a7b1ab0d3a31a57b7d09e8de570 WatchSource:0}: Error finding container 1718090e759c1ad8466f32f4f6ca0841a13a1a7b1ab0d3a31a57b7d09e8de570: Status 404 returned error can't find the container with id 1718090e759c1ad8466f32f4f6ca0841a13a1a7b1ab0d3a31a57b7d09e8de570 Apr 22 15:36:31.790244 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:31.790205 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82tqk" event={"ID":"c2e72f84-cb38-472e-abba-c2f44adaf2fd","Type":"ContainerStarted","Data":"1718090e759c1ad8466f32f4f6ca0841a13a1a7b1ab0d3a31a57b7d09e8de570"} Apr 22 15:36:32.161221 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.161172 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrlq"] Apr 22 15:36:32.166257 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.166230 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.169732 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.169491 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-6mcjg\"" Apr 22 15:36:32.172351 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.172326 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 15:36:32.172541 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.172326 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 15:36:32.188780 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.188745 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrlq"] Apr 22 15:36:32.340322 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.340264 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxlqs\" (UniqueName: \"kubernetes.io/projected/15ffedbe-fc51-476b-a53a-95c611e46693-kube-api-access-mxlqs\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.340499 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.340356 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15ffedbe-fc51-476b-a53a-95c611e46693-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.340499 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.340385 2534 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15ffedbe-fc51-476b-a53a-95c611e46693-data-volume\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.340499 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.340410 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15ffedbe-fc51-476b-a53a-95c611e46693-crio-socket\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.340499 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.340442 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15ffedbe-fc51-476b-a53a-95c611e46693-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.441716 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.441664 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15ffedbe-fc51-476b-a53a-95c611e46693-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.441854 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.441721 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15ffedbe-fc51-476b-a53a-95c611e46693-data-volume\") pod \"insights-runtime-extractor-qsrlq\" (UID: 
\"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.441854 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.441759 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/15ffedbe-fc51-476b-a53a-95c611e46693-crio-socket\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.441854 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.441792 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15ffedbe-fc51-476b-a53a-95c611e46693-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.442024 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.441857 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxlqs\" (UniqueName: \"kubernetes.io/projected/15ffedbe-fc51-476b-a53a-95c611e46693-kube-api-access-mxlqs\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.442203 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.442176 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/15ffedbe-fc51-476b-a53a-95c611e46693-data-volume\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.442319 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.442300 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" 
(UniqueName: \"kubernetes.io/host-path/15ffedbe-fc51-476b-a53a-95c611e46693-crio-socket\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.442820 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.442790 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/15ffedbe-fc51-476b-a53a-95c611e46693-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.445478 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.445451 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/15ffedbe-fc51-476b-a53a-95c611e46693-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.454694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.454666 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxlqs\" (UniqueName: \"kubernetes.io/projected/15ffedbe-fc51-476b-a53a-95c611e46693-kube-api-access-mxlqs\") pod \"insights-runtime-extractor-qsrlq\" (UID: \"15ffedbe-fc51-476b-a53a-95c611e46693\") " pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.477513 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.477473 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-qsrlq" Apr 22 15:36:32.617456 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.617416 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-qsrlq"] Apr 22 15:36:32.621038 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:32.621003 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15ffedbe_fc51_476b_a53a_95c611e46693.slice/crio-b5ea8ebe464e223822a8d2e65a29b7d2fc7759b745a3c52056dec3743d10158b WatchSource:0}: Error finding container b5ea8ebe464e223822a8d2e65a29b7d2fc7759b745a3c52056dec3743d10158b: Status 404 returned error can't find the container with id b5ea8ebe464e223822a8d2e65a29b7d2fc7759b745a3c52056dec3743d10158b Apr 22 15:36:32.794086 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.794050 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82tqk" event={"ID":"c2e72f84-cb38-472e-abba-c2f44adaf2fd","Type":"ContainerStarted","Data":"eb56d62771b60556541b157f0792e7a7cb64b79e97057189c375e1c44bf97d58"} Apr 22 15:36:32.794086 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.794086 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-82tqk" event={"ID":"c2e72f84-cb38-472e-abba-c2f44adaf2fd","Type":"ContainerStarted","Data":"88b49b27b33b7cd23c587f1d16200f8cd72715b722647db39630cf1a8d9372f0"} Apr 22 15:36:32.795380 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.795344 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrlq" event={"ID":"15ffedbe-fc51-476b-a53a-95c611e46693","Type":"ContainerStarted","Data":"99b5527a1b5b3373ad94d69fc855c729e3e674a938ac16b1a413248a0f7c9dd8"} Apr 22 15:36:32.795380 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.795375 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-qsrlq" event={"ID":"15ffedbe-fc51-476b-a53a-95c611e46693","Type":"ContainerStarted","Data":"b5ea8ebe464e223822a8d2e65a29b7d2fc7759b745a3c52056dec3743d10158b"} Apr 22 15:36:32.816189 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:32.816131 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-82tqk" podStartSLOduration=129.619958654 podStartE2EDuration="2m10.816114789s" podCreationTimestamp="2026-04-22 15:34:22 +0000 UTC" firstStartedPulling="2026-04-22 15:36:31.230908298 +0000 UTC m=+129.512061917" lastFinishedPulling="2026-04-22 15:36:32.427064423 +0000 UTC m=+130.708218052" observedRunningTime="2026-04-22 15:36:32.815988273 +0000 UTC m=+131.097141915" watchObservedRunningTime="2026-04-22 15:36:32.816114789 +0000 UTC m=+131.097268431" Apr 22 15:36:33.799695 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:33.799592 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrlq" event={"ID":"15ffedbe-fc51-476b-a53a-95c611e46693","Type":"ContainerStarted","Data":"b657c802220d76d672e429973069f9df2af822799bf5cde3114040150d5934aa"} Apr 22 15:36:35.807451 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:35.807408 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-qsrlq" event={"ID":"15ffedbe-fc51-476b-a53a-95c611e46693","Type":"ContainerStarted","Data":"dcb0161d7ecb4ed0ad1adad7cc37c2b688e6130297052e39aa7929568442097f"} Apr 22 15:36:35.830317 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:35.830263 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-qsrlq" podStartSLOduration=1.363517791 podStartE2EDuration="3.830247022s" podCreationTimestamp="2026-04-22 15:36:32 +0000 UTC" firstStartedPulling="2026-04-22 15:36:32.694355648 +0000 UTC m=+130.975509268" lastFinishedPulling="2026-04-22 
15:36:35.161084869 +0000 UTC m=+133.442238499" observedRunningTime="2026-04-22 15:36:35.82860872 +0000 UTC m=+134.109762366" watchObservedRunningTime="2026-04-22 15:36:35.830247022 +0000 UTC m=+134.111400664" Apr 22 15:36:37.202796 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.202764 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:36:37.205561 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.205544 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.209270 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.209242 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-sfr9v\"" Apr 22 15:36:37.209447 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.209243 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 15:36:37.209447 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.209352 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 15:36:37.209447 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.209359 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 15:36:37.210886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.210767 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 15:36:37.210886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.210807 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 15:36:37.210886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.210813 2534 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 15:36:37.210886 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.210839 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 15:36:37.213840 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.213783 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 15:36:37.215568 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.215546 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:36:37.278368 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278332 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278555 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278378 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhft\" (UniqueName: \"kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278555 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278449 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278555 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:36:37.278475 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278555 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278506 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278723 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278585 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.278723 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.278634 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379760 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379723 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config\") pod 
\"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379766 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379804 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379824 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhft\" (UniqueName: \"kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379854 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379871 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.379891 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.379892 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.380740 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.380708 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.380847 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.380818 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.380995 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.380974 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.381240 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.381219 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" 
(UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.382493 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.382476 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.382578 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.382516 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.388956 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.388932 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhft\" (UniqueName: \"kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft\") pod \"console-796f856dd4-g4tnb\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.515915 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.515823 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:37.642297 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.642263 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:36:37.645089 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:37.645060 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5224ca01_2eef_4fc2_a0fa_4b956da404fd.slice/crio-69b687d433d98ba219b9dc04a9e2b4c8f503734d6656c9a6d434306c62c4b0c3 WatchSource:0}: Error finding container 69b687d433d98ba219b9dc04a9e2b4c8f503734d6656c9a6d434306c62c4b0c3: Status 404 returned error can't find the container with id 69b687d433d98ba219b9dc04a9e2b4c8f503734d6656c9a6d434306c62c4b0c3 Apr 22 15:36:37.813552 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:37.813449 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f856dd4-g4tnb" event={"ID":"5224ca01-2eef-4fc2-a0fa-4b956da404fd","Type":"ContainerStarted","Data":"69b687d433d98ba219b9dc04a9e2b4c8f503734d6656c9a6d434306c62c4b0c3"} Apr 22 15:36:40.823821 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:40.823784 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f856dd4-g4tnb" event={"ID":"5224ca01-2eef-4fc2-a0fa-4b956da404fd","Type":"ContainerStarted","Data":"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581"} Apr 22 15:36:40.844371 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:40.844316 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-796f856dd4-g4tnb" podStartSLOduration=1.135113324 podStartE2EDuration="3.844298473s" podCreationTimestamp="2026-04-22 15:36:37 +0000 UTC" firstStartedPulling="2026-04-22 15:36:37.64725034 +0000 UTC m=+135.928403961" lastFinishedPulling="2026-04-22 15:36:40.356435479 +0000 UTC m=+138.637589110" 
observedRunningTime="2026-04-22 15:36:40.84264566 +0000 UTC m=+139.123799302" watchObservedRunningTime="2026-04-22 15:36:40.844298473 +0000 UTC m=+139.125452115" Apr 22 15:36:41.508246 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.508208 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hws6q"] Apr 22 15:36:41.511579 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.511562 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.514568 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.514545 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 15:36:41.514568 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.514550 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 15:36:41.514878 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.514861 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 15:36:41.514934 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.514916 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-pqtf6\"" Apr 22 15:36:41.515784 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.515765 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 15:36:41.617427 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617393 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-wtmp\") pod \"node-exporter-hws6q\" (UID: 
\"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617442 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617464 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-metrics-client-ca\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617486 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-root\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617518 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617563 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617601 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617580 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-textfile\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617789 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617611 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-sys\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.617789 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.617627 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkks\" (UniqueName: \"kubernetes.io/projected/388711be-4fdc-4e0d-85ec-767a64fbe0ec-kube-api-access-lzkks\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718432 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718394 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " 
pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718614 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718441 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-metrics-client-ca\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718614 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718462 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-root\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718614 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718503 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718614 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718544 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718614 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718580 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-textfile\") pod \"node-exporter-hws6q\" 
(UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718633 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-sys\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718680 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-sys\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718680 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-root\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:41.718688 2534 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:41.718766 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls podName:388711be-4fdc-4e0d-85ec-767a64fbe0ec nodeName:}" failed. No retries permitted until 2026-04-22 15:36:42.218737415 +0000 UTC m=+140.499891039 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls") pod "node-exporter-hws6q" (UID: "388711be-4fdc-4e0d-85ec-767a64fbe0ec") : secret "node-exporter-tls" not found Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718793 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkks\" (UniqueName: \"kubernetes.io/projected/388711be-4fdc-4e0d-85ec-767a64fbe0ec-kube-api-access-lzkks\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.718858 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.718845 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-wtmp\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.719167 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.719016 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-wtmp\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.719167 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.719068 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-textfile\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.719742 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.719725 2534 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-accelerators-collector-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.719880 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.719858 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/388711be-4fdc-4e0d-85ec-767a64fbe0ec-metrics-client-ca\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.721156 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.721133 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:41.729059 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:41.729034 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkks\" (UniqueName: \"kubernetes.io/projected/388711be-4fdc-4e0d-85ec-767a64fbe0ec-kube-api-access-lzkks\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:42.223930 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:42.223891 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " 
pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:42.227941 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:42.227913 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/388711be-4fdc-4e0d-85ec-767a64fbe0ec-node-exporter-tls\") pod \"node-exporter-hws6q\" (UID: \"388711be-4fdc-4e0d-85ec-767a64fbe0ec\") " pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:42.420947 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:42.420909 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hws6q" Apr 22 15:36:42.429507 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:42.429465 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388711be_4fdc_4e0d_85ec_767a64fbe0ec.slice/crio-a0d68f66283368701318deaeaa3d132e3bee3d32bc35f8bb2b8295df303d0462 WatchSource:0}: Error finding container a0d68f66283368701318deaeaa3d132e3bee3d32bc35f8bb2b8295df303d0462: Status 404 returned error can't find the container with id a0d68f66283368701318deaeaa3d132e3bee3d32bc35f8bb2b8295df303d0462 Apr 22 15:36:42.830218 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:42.830174 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hws6q" event={"ID":"388711be-4fdc-4e0d-85ec-767a64fbe0ec","Type":"ContainerStarted","Data":"a0d68f66283368701318deaeaa3d132e3bee3d32bc35f8bb2b8295df303d0462"} Apr 22 15:36:43.833981 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:43.833941 2534 generic.go:358] "Generic (PLEG): container finished" podID="388711be-4fdc-4e0d-85ec-767a64fbe0ec" containerID="c24182a2990d6bbf0739be92e679d623bf6dda6957b61d258fb94705a58d4065" exitCode=0 Apr 22 15:36:43.834351 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:43.833994 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hws6q" 
event={"ID":"388711be-4fdc-4e0d-85ec-767a64fbe0ec","Type":"ContainerDied","Data":"c24182a2990d6bbf0739be92e679d623bf6dda6957b61d258fb94705a58d4065"} Apr 22 15:36:44.519237 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.519197 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-6947f4988d-hdlkz"] Apr 22 15:36:44.521864 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.521843 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.524597 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.524574 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 22 15:36:44.524751 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.524574 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 22 15:36:44.524751 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.524662 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-ld4nd\"" Apr 22 15:36:44.524862 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.524853 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 22 15:36:44.525042 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.525028 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 22 15:36:44.525608 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.525595 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 22 15:36:44.525653 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.525626 2534 reflector.go:430] "Caches 
populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-9rl5r7mt7u8pb\"" Apr 22 15:36:44.538849 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.538816 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6947f4988d-hdlkz"] Apr 22 15:36:44.543844 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.543814 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvr8\" (UniqueName: \"kubernetes.io/projected/1919ccae-c280-4f0e-908b-93a4d05a4ecb-kube-api-access-mvvr8\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544020 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.543851 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544020 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.543893 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-grpc-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544020 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.543971 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/1919ccae-c280-4f0e-908b-93a4d05a4ecb-metrics-client-ca\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544020 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.544010 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544254 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.544078 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544254 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.544104 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.544254 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.544182 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645138 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645087 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvr8\" (UniqueName: \"kubernetes.io/projected/1919ccae-c280-4f0e-908b-93a4d05a4ecb-kube-api-access-mvvr8\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645138 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645140 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645192 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-grpc-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645224 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1919ccae-c280-4f0e-908b-93a4d05a4ecb-metrics-client-ca\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " 
pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645249 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645285 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645323 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.645405 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.645375 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.646073 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:36:44.646028 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1919ccae-c280-4f0e-908b-93a4d05a4ecb-metrics-client-ca\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648326 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648287 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648470 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648303 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648583 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648560 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648722 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648704 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: 
\"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648771 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648719 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.648815 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.648786 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1919ccae-c280-4f0e-908b-93a4d05a4ecb-secret-grpc-tls\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.655592 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.655556 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvr8\" (UniqueName: \"kubernetes.io/projected/1919ccae-c280-4f0e-908b-93a4d05a4ecb-kube-api-access-mvvr8\") pod \"thanos-querier-6947f4988d-hdlkz\" (UID: \"1919ccae-c280-4f0e-908b-93a4d05a4ecb\") " pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.831678 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.831583 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:44.838026 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.837992 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hws6q" event={"ID":"388711be-4fdc-4e0d-85ec-767a64fbe0ec","Type":"ContainerStarted","Data":"733e4f56acaf297c7f4c304c0e63db679eda4971aba6708791cf0504111115f2"} Apr 22 15:36:44.838368 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.838033 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hws6q" event={"ID":"388711be-4fdc-4e0d-85ec-767a64fbe0ec","Type":"ContainerStarted","Data":"4dac51fff33705e59cc9b33e5a2edfdbb534c29b2f358331d00393d9c5c38618"} Apr 22 15:36:44.862303 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.862233 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hws6q" podStartSLOduration=3.063184701 podStartE2EDuration="3.862211929s" podCreationTimestamp="2026-04-22 15:36:41 +0000 UTC" firstStartedPulling="2026-04-22 15:36:42.431241452 +0000 UTC m=+140.712395071" lastFinishedPulling="2026-04-22 15:36:43.230268664 +0000 UTC m=+141.511422299" observedRunningTime="2026-04-22 15:36:44.860813223 +0000 UTC m=+143.141966865" watchObservedRunningTime="2026-04-22 15:36:44.862211929 +0000 UTC m=+143.143365567" Apr 22 15:36:44.974368 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:44.974340 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-6947f4988d-hdlkz"] Apr 22 15:36:44.976762 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:44.976730 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1919ccae_c280_4f0e_908b_93a4d05a4ecb.slice/crio-6865e66d13051c750fa9329d5badf48e6866da4a4ce2c5fa7e1a42f28d3397a3 WatchSource:0}: Error finding container 
6865e66d13051c750fa9329d5badf48e6866da4a4ce2c5fa7e1a42f28d3397a3: Status 404 returned error can't find the container with id 6865e66d13051c750fa9329d5badf48e6866da4a4ce2c5fa7e1a42f28d3397a3 Apr 22 15:36:45.842409 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:45.842361 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"6865e66d13051c750fa9329d5badf48e6866da4a4ce2c5fa7e1a42f28d3397a3"} Apr 22 15:36:46.308942 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.308904 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn"] Apr 22 15:36:46.310935 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.310917 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:46.313566 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.313537 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-hvcqk\"" Apr 22 15:36:46.313789 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.313774 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 15:36:46.319949 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.319921 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn"] Apr 22 15:36:46.360912 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.360873 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hbnxn\" (UID: \"202e188a-bbd5-4a7c-b299-df34fb70814b\") " 
pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:46.461519 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.461481 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hbnxn\" (UID: \"202e188a-bbd5-4a7c-b299-df34fb70814b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:46.461762 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:46.461647 2534 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 15:36:46.461762 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:46.461727 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert podName:202e188a-bbd5-4a7c-b299-df34fb70814b nodeName:}" failed. No retries permitted until 2026-04-22 15:36:46.961707346 +0000 UTC m=+145.242860982 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-hbnxn" (UID: "202e188a-bbd5-4a7c-b299-df34fb70814b") : secret "monitoring-plugin-cert" not found Apr 22 15:36:46.966952 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:46.966910 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hbnxn\" (UID: \"202e188a-bbd5-4a7c-b299-df34fb70814b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:46.967383 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:46.967075 2534 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 15:36:46.967383 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:46.967159 2534 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert podName:202e188a-bbd5-4a7c-b299-df34fb70814b nodeName:}" failed. No retries permitted until 2026-04-22 15:36:47.967138116 +0000 UTC m=+146.248291736 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-hbnxn" (UID: "202e188a-bbd5-4a7c-b299-df34fb70814b") : secret "monitoring-plugin-cert" not found Apr 22 15:36:47.516919 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.516855 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:47.517096 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.517025 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:47.521865 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.521837 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:47.769321 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.769232 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:36:47.771972 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.771953 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.775390 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.775366 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 15:36:47.775511 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.775366 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8dmq8\"" Apr 22 15:36:47.775511 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.775410 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 15:36:47.776347 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776328 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8hjm818pe1kap\"" Apr 22 15:36:47.776468 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776446 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 15:36:47.776468 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776335 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 15:36:47.776609 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776375 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 15:36:47.776609 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776494 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 15:36:47.776609 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776369 2534 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 15:36:47.776768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776730 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 15:36:47.776813 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.776778 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 15:36:47.777411 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.777392 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 15:36:47.777508 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.777451 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:36:47.777583 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.777509 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 15:36:47.781189 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.781167 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 15:36:47.787600 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.787578 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:36:47.861477 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.861434 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"09dedaf6ea99ca0b66660645a5e6796c9d97fec9312dc28805d1e74ff8950fbd"} Apr 22 15:36:47.861683 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.861483 2534 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"f77cad2796eb32ee9183a2d854a28bf4368c55a06a8d4dbd572d07f6762226b1"} Apr 22 15:36:47.861683 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.861501 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"6badac58d96e0e1576b59963d5a1fecccc4bd4389f17e408ebe041309801b61b"} Apr 22 15:36:47.866787 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.866721 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:36:47.873381 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873351 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873588 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873387 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvpcl\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873588 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873434 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873588 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873560 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873715 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873604 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873715 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873675 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873715 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873708 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873863 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873743 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873863 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873822 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873951 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873899 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.873951 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873928 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874028 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.873978 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874114 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874093 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874154 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874130 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874196 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874167 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874278 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874197 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874278 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874219 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.874278 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.874247 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.974712 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.974670 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.974712 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.974718 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.974743 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.974793 2534 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975025 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975078 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975136 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975155 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975193 2534 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975218 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975248 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975243 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvpcl\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975285 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975331 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:36:47.975358 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975415 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975458 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975492 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hbnxn\" (UID: \"202e188a-bbd5-4a7c-b299-df34fb70814b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975562 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.975908 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.975612 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.976280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.976087 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.976280 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.976113 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.977240 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.977011 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978012 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.977860 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978012 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.977946 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978347 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.978310 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978661 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.978632 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978823 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.978801 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978823 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.978814 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.978985 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.978947 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.979180 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.979160 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/202e188a-bbd5-4a7c-b299-df34fb70814b-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-hbnxn\" (UID: \"202e188a-bbd5-4a7c-b299-df34fb70814b\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:47.979546 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.979498 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.981341 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981314 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.981549 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981497 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.981781 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981760 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.981884 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981870 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.982008 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981987 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.982041 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.981994 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:47.989197 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:47.989169 2534 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lvpcl\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl\") pod \"prometheus-k8s-0\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:48.082260 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.082169 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:36:48.121332 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.121286 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:48.239352 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.239309 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:36:48.275164 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.275108 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn"] Apr 22 15:36:48.337150 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:48.337085 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod321042bf_be77_487c_ae63_1ca45d4906e3.slice/crio-2c663dd0d73ec3393a2503621ed751a8c18863115475ca81822f903b7c706497 WatchSource:0}: Error finding container 2c663dd0d73ec3393a2503621ed751a8c18863115475ca81822f903b7c706497: Status 404 returned error can't find the container with id 2c663dd0d73ec3393a2503621ed751a8c18863115475ca81822f903b7c706497 Apr 22 15:36:48.337606 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:36:48.337579 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202e188a_bbd5_4a7c_b299_df34fb70814b.slice/crio-daa744777e5ebf7780963eedb4ccbc6e7495f892042f69e7373dac38a5ebbad1 WatchSource:0}: Error finding 
container daa744777e5ebf7780963eedb4ccbc6e7495f892042f69e7373dac38a5ebbad1: Status 404 returned error can't find the container with id daa744777e5ebf7780963eedb4ccbc6e7495f892042f69e7373dac38a5ebbad1 Apr 22 15:36:48.868092 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.868049 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"06457dff0611c83a5b69f9486b96756a85e944fd59f437356f234dee7c617028"} Apr 22 15:36:48.868092 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.868099 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"fe92325ad72933799853d1e81479aea0522b9bd1e4a5d07618fc4ae4c1af4242"} Apr 22 15:36:48.868354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.868118 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" event={"ID":"1919ccae-c280-4f0e-908b-93a4d05a4ecb","Type":"ContainerStarted","Data":"0d0565f1dfada9d386fe903d05675eac9b56f50ae1f661d40ab1f8dbe776db97"} Apr 22 15:36:48.868354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.868268 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:48.869332 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.869300 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" event={"ID":"202e188a-bbd5-4a7c-b299-df34fb70814b","Type":"ContainerStarted","Data":"daa744777e5ebf7780963eedb4ccbc6e7495f892042f69e7373dac38a5ebbad1"} Apr 22 15:36:48.870543 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.870498 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"2c663dd0d73ec3393a2503621ed751a8c18863115475ca81822f903b7c706497"} Apr 22 15:36:48.893872 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:48.893820 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" podStartSLOduration=1.510478989 podStartE2EDuration="4.893800725s" podCreationTimestamp="2026-04-22 15:36:44 +0000 UTC" firstStartedPulling="2026-04-22 15:36:44.97866688 +0000 UTC m=+143.259820501" lastFinishedPulling="2026-04-22 15:36:48.36198861 +0000 UTC m=+146.643142237" observedRunningTime="2026-04-22 15:36:48.893370287 +0000 UTC m=+147.174523930" watchObservedRunningTime="2026-04-22 15:36:48.893800725 +0000 UTC m=+147.174954364" Apr 22 15:36:49.876122 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.876088 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" event={"ID":"202e188a-bbd5-4a7c-b299-df34fb70814b","Type":"ContainerStarted","Data":"25de6350565af328493d2057851799fcb20a875849310605157855d192babb86"} Apr 22 15:36:49.876599 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.876240 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:49.877551 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.877509 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" exitCode=0 Apr 22 15:36:49.877692 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.877557 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} Apr 22 
15:36:49.881662 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.881644 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" Apr 22 15:36:49.891617 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:49.891477 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-hbnxn" podStartSLOduration=2.574467414 podStartE2EDuration="3.891463602s" podCreationTimestamp="2026-04-22 15:36:46 +0000 UTC" firstStartedPulling="2026-04-22 15:36:48.339372728 +0000 UTC m=+146.620526347" lastFinishedPulling="2026-04-22 15:36:49.656368911 +0000 UTC m=+147.937522535" observedRunningTime="2026-04-22 15:36:49.890363596 +0000 UTC m=+148.171517250" watchObservedRunningTime="2026-04-22 15:36:49.891463602 +0000 UTC m=+148.172617273" Apr 22 15:36:54.884324 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.884291 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-6947f4988d-hdlkz" Apr 22 15:36:54.894229 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894196 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} Apr 22 15:36:54.894229 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894233 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} Apr 22 15:36:54.894428 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894243 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} Apr 22 15:36:54.894428 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894251 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} Apr 22 15:36:54.894428 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894261 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} Apr 22 15:36:54.894428 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.894269 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerStarted","Data":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} Apr 22 15:36:54.939549 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:54.939463 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.1862892 podStartE2EDuration="7.939439736s" podCreationTimestamp="2026-04-22 15:36:47 +0000 UTC" firstStartedPulling="2026-04-22 15:36:48.339057914 +0000 UTC m=+146.620211533" lastFinishedPulling="2026-04-22 15:36:54.092208449 +0000 UTC m=+152.373362069" observedRunningTime="2026-04-22 15:36:54.936774421 +0000 UTC m=+153.217928063" watchObservedRunningTime="2026-04-22 15:36:54.939439736 +0000 UTC m=+153.220593384" Apr 22 15:36:58.082361 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:58.082321 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 
15:36:58.090425 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:58.090380 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-cqvj8" podUID="aec47240-1952-48af-b917-7fdd3074710a" Apr 22 15:36:58.099443 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:36:58.099399 2534 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-fshk9" podUID="b91d8a6d-a426-4263-ae63-99ecd3ff6949" Apr 22 15:36:58.906550 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:58.906503 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqvj8" Apr 22 15:36:59.876397 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:36:59.876353 2534 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:37:03.018337 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.018289 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:37:03.018792 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.018367 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:37:03.021017 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.020987 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/aec47240-1952-48af-b917-7fdd3074710a-metrics-tls\") pod \"dns-default-cqvj8\" (UID: \"aec47240-1952-48af-b917-7fdd3074710a\") " pod="openshift-dns/dns-default-cqvj8" Apr 22 15:37:03.021126 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.021039 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b91d8a6d-a426-4263-ae63-99ecd3ff6949-cert\") pod \"ingress-canary-fshk9\" (UID: \"b91d8a6d-a426-4263-ae63-99ecd3ff6949\") " pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:37:03.109898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.109862 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xgzsv\"" Apr 22 15:37:03.118315 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.118281 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-cqvj8" Apr 22 15:37:03.249040 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.249010 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-cqvj8"] Apr 22 15:37:03.251647 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:37:03.251617 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaec47240_1952_48af_b917_7fdd3074710a.slice/crio-8210a761ec5cf745b28dbbc6ebbc42504babe28a8ee3b6b16d10309e9f707913 WatchSource:0}: Error finding container 8210a761ec5cf745b28dbbc6ebbc42504babe28a8ee3b6b16d10309e9f707913: Status 404 returned error can't find the container with id 8210a761ec5cf745b28dbbc6ebbc42504babe28a8ee3b6b16d10309e9f707913 Apr 22 15:37:03.920945 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:03.920896 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqvj8" 
event={"ID":"aec47240-1952-48af-b917-7fdd3074710a","Type":"ContainerStarted","Data":"8210a761ec5cf745b28dbbc6ebbc42504babe28a8ee3b6b16d10309e9f707913"} Apr 22 15:37:04.925450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:04.925409 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqvj8" event={"ID":"aec47240-1952-48af-b917-7fdd3074710a","Type":"ContainerStarted","Data":"0681c73b892f9c49be0051d07d31b37a08328e457b36bb859a73252d5ec9275c"} Apr 22 15:37:04.925450 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:04.925445 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-cqvj8" event={"ID":"aec47240-1952-48af-b917-7fdd3074710a","Type":"ContainerStarted","Data":"d27602cf34b2187d70969744b7601f0d17dd90c1d6d75a238f44c580a8ab43cf"} Apr 22 15:37:04.925893 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:04.925542 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-cqvj8" Apr 22 15:37:04.948836 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:04.948777 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-cqvj8" podStartSLOduration=128.592930343 podStartE2EDuration="2m9.948760769s" podCreationTimestamp="2026-04-22 15:34:55 +0000 UTC" firstStartedPulling="2026-04-22 15:37:03.25354643 +0000 UTC m=+161.534700051" lastFinishedPulling="2026-04-22 15:37:04.609376857 +0000 UTC m=+162.890530477" observedRunningTime="2026-04-22 15:37:04.947450439 +0000 UTC m=+163.228604083" watchObservedRunningTime="2026-04-22 15:37:04.948760769 +0000 UTC m=+163.229914410" Apr 22 15:37:09.282192 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:09.282099 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:37:09.285087 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:09.285066 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fpsm4\"" Apr 22 15:37:09.293441 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:09.293408 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-fshk9" Apr 22 15:37:09.427277 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:09.427244 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-fshk9"] Apr 22 15:37:09.430494 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:37:09.430461 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb91d8a6d_a426_4263_ae63_99ecd3ff6949.slice/crio-eb6be0372077ddfd4d68bc2bd1a869651bfc1174da22ca44fe5c748b0caa1271 WatchSource:0}: Error finding container eb6be0372077ddfd4d68bc2bd1a869651bfc1174da22ca44fe5c748b0caa1271: Status 404 returned error can't find the container with id eb6be0372077ddfd4d68bc2bd1a869651bfc1174da22ca44fe5c748b0caa1271 Apr 22 15:37:09.941475 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:09.941433 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fshk9" event={"ID":"b91d8a6d-a426-4263-ae63-99ecd3ff6949","Type":"ContainerStarted","Data":"eb6be0372077ddfd4d68bc2bd1a869651bfc1174da22ca44fe5c748b0caa1271"} Apr 22 15:37:11.948731 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:11.948684 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-fshk9" event={"ID":"b91d8a6d-a426-4263-ae63-99ecd3ff6949","Type":"ContainerStarted","Data":"ad2b4e1f869f9ed18c852bd7edc78c5b578e69cc707c66dc615c51e561f67090"} Apr 22 15:37:11.964474 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:37:11.964421 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-fshk9" podStartSLOduration=135.193361231 podStartE2EDuration="2m16.96440659s" podCreationTimestamp="2026-04-22 15:34:55 +0000 UTC" firstStartedPulling="2026-04-22 15:37:09.432438145 +0000 UTC m=+167.713591765" lastFinishedPulling="2026-04-22 15:37:11.203483504 +0000 UTC m=+169.484637124" observedRunningTime="2026-04-22 15:37:11.964117954 +0000 UTC m=+170.245271815" watchObservedRunningTime="2026-04-22 15:37:11.96440659 +0000 UTC m=+170.245560231" Apr 22 15:37:14.931231 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:14.931202 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-cqvj8" Apr 22 15:37:18.970533 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:18.970496 2534 generic.go:358] "Generic (PLEG): container finished" podID="4269d0dd-d09b-4927-96de-1b3ab59b5ec7" containerID="20eed9b73a36b05843e6abb2a9bb5f27fb5edd4a43a31be6c97401016260e5de" exitCode=0 Apr 22 15:37:18.970912 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:18.970579 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d45rg" event={"ID":"4269d0dd-d09b-4927-96de-1b3ab59b5ec7","Type":"ContainerDied","Data":"20eed9b73a36b05843e6abb2a9bb5f27fb5edd4a43a31be6c97401016260e5de"} Apr 22 15:37:18.970956 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:18.970938 2534 scope.go:117] "RemoveContainer" containerID="20eed9b73a36b05843e6abb2a9bb5f27fb5edd4a43a31be6c97401016260e5de" Apr 22 15:37:19.974754 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:19.974723 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-d45rg" event={"ID":"4269d0dd-d09b-4927-96de-1b3ab59b5ec7","Type":"ContainerStarted","Data":"4a22b51910eac91683eebc6d9b36f3206b30c60520f15d3f9a452a644864aae0"} Apr 22 15:37:24.898280 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:37:24.898214 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-796f856dd4-g4tnb" podUID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" containerName="console" containerID="cri-o://1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581" gracePeriod=15 Apr 22 15:37:25.144420 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.144396 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-796f856dd4-g4tnb_5224ca01-2eef-4fc2-a0fa-4b956da404fd/console/0.log" Apr 22 15:37:25.144575 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.144471 2534 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:37:25.214575 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214504 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214758 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214586 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214758 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214684 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhft\" (UniqueName: \"kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214758 ip-10-0-130-86 kubenswrapper[2534]: I0422 
15:37:25.214715 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214774 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214800 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.214898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214860 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca\") pod \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\" (UID: \"5224ca01-2eef-4fc2-a0fa-4b956da404fd\") " Apr 22 15:37:25.215033 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.214960 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config" (OuterVolumeSpecName: "console-config") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:25.215109 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.215082 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:25.215232 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.215208 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:25.215309 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.215237 2534 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-trusted-ca-bundle\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.215309 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.215248 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca" (OuterVolumeSpecName: "service-ca") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:37:25.215309 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.215264 2534 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-config\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.217223 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.217194 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:25.217223 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.217206 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft" (OuterVolumeSpecName: "kube-api-access-jzhft") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "kube-api-access-jzhft". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:37:25.217365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.217210 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5224ca01-2eef-4fc2-a0fa-4b956da404fd" (UID: "5224ca01-2eef-4fc2-a0fa-4b956da404fd"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:37:25.316557 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.316489 2534 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jzhft\" (UniqueName: \"kubernetes.io/projected/5224ca01-2eef-4fc2-a0fa-4b956da404fd-kube-api-access-jzhft\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.316557 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.316519 2534 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-oauth-serving-cert\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.316557 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.316559 2534 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-serving-cert\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.316836 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.316574 2534 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5224ca01-2eef-4fc2-a0fa-4b956da404fd-console-oauth-config\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.316836 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.316587 2534 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5224ca01-2eef-4fc2-a0fa-4b956da404fd-service-ca\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:37:25.995323 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995298 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-796f856dd4-g4tnb_5224ca01-2eef-4fc2-a0fa-4b956da404fd/console/0.log" Apr 22 15:37:25.995783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995335 2534 generic.go:358] "Generic (PLEG): 
container finished" podID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" containerID="1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581" exitCode=2 Apr 22 15:37:25.995783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995371 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f856dd4-g4tnb" event={"ID":"5224ca01-2eef-4fc2-a0fa-4b956da404fd","Type":"ContainerDied","Data":"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581"} Apr 22 15:37:25.995783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995400 2534 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-796f856dd4-g4tnb" Apr 22 15:37:25.995783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995413 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-796f856dd4-g4tnb" event={"ID":"5224ca01-2eef-4fc2-a0fa-4b956da404fd","Type":"ContainerDied","Data":"69b687d433d98ba219b9dc04a9e2b4c8f503734d6656c9a6d434306c62c4b0c3"} Apr 22 15:37:25.995783 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:25.995428 2534 scope.go:117] "RemoveContainer" containerID="1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581" Apr 22 15:37:26.004578 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:26.004558 2534 scope.go:117] "RemoveContainer" containerID="1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581" Apr 22 15:37:26.004862 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:37:26.004844 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581\": container with ID starting with 1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581 not found: ID does not exist" containerID="1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581" Apr 22 15:37:26.004917 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:26.004871 2534 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581"} err="failed to get container status \"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581\": rpc error: code = NotFound desc = could not find container \"1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581\": container with ID starting with 1e3d427aead18166e29107dc88617ab6474134c09be9eb1f5a17a6e003941581 not found: ID does not exist" Apr 22 15:37:26.017436 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:26.017406 2534 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:37:26.025577 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:26.025548 2534 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-796f856dd4-g4tnb"] Apr 22 15:37:26.286713 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:26.286606 2534 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" path="/var/lib/kubelet/pods/5224ca01-2eef-4fc2-a0fa-4b956da404fd/volumes" Apr 22 15:37:29.005685 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:29.005653 2534 generic.go:358] "Generic (PLEG): container finished" podID="6fc37a4f-3b0b-435c-a8c6-ceab191ad796" containerID="5637f15d0dc83cf6b22a30eba023801c3a3a14de14210e52f9d40e9bd3fa7c7f" exitCode=0 Apr 22 15:37:29.006219 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:29.005731 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" event={"ID":"6fc37a4f-3b0b-435c-a8c6-ceab191ad796","Type":"ContainerDied","Data":"5637f15d0dc83cf6b22a30eba023801c3a3a14de14210e52f9d40e9bd3fa7c7f"} Apr 22 15:37:29.006219 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:29.006080 2534 scope.go:117] "RemoveContainer" 
containerID="5637f15d0dc83cf6b22a30eba023801c3a3a14de14210e52f9d40e9bd3fa7c7f" Apr 22 15:37:30.010366 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:30.010329 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-gp7j9" event={"ID":"6fc37a4f-3b0b-435c-a8c6-ceab191ad796","Type":"ContainerStarted","Data":"61e06c14d5dabab401e374faf244ec6588b493abbf61bb79b6f100d63863a069"} Apr 22 15:37:48.082836 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:48.082789 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:37:48.100371 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:48.100340 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:37:49.085981 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:37:49.085951 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:06.143318 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143273 2534 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:06.143840 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143789 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="prometheus" containerID="cri-o://d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" gracePeriod=600 Apr 22 15:38:06.143917 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143850 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="thanos-sidecar" containerID="cri-o://2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" 
gracePeriod=600 Apr 22 15:38:06.143917 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143834 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-thanos" containerID="cri-o://6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" gracePeriod=600 Apr 22 15:38:06.144022 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143937 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="config-reloader" containerID="cri-o://35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" gracePeriod=600 Apr 22 15:38:06.144022 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.143937 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-web" containerID="cri-o://244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" gracePeriod=600 Apr 22 15:38:06.144111 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.144029 2534 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy" containerID="cri-o://b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" gracePeriod=600 Apr 22 15:38:06.389057 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.389031 2534 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:06.561809 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561770 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.561809 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561813 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561833 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561856 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561882 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: 
\"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561914 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvpcl\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561949 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.561982 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562014 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562081 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562043 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 
15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562095 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562126 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562158 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562191 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562226 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 
22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562230 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562254 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562297 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.562461 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562323 2534 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0\") pod \"321042bf-be77-487c-ae63-1ca45d4906e3\" (UID: \"321042bf-be77-487c-ae63-1ca45d4906e3\") " Apr 22 15:38:06.563104 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.562676 2534 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-trusted-ca-bundle\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.563104 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.563089 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:06.563808 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.563509 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:06.563808 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.563636 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:38:06.564111 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.564017 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:06.565403 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.565080 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 15:38:06.565403 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.565161 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config" (OuterVolumeSpecName: "config") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.565604 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.565545 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out" (OuterVolumeSpecName: "config-out") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 15:38:06.565730 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.565700 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.565804 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.565782 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.566809 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.566617 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl" (OuterVolumeSpecName: "kube-api-access-lvpcl") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "kube-api-access-lvpcl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:38:06.566809 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.566706 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.566982 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.566854 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.567204 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.567175 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.567404 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.567384 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.567986 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.567966 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.568057 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.567990 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 15:38:06.577909 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.577873 2534 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config" (OuterVolumeSpecName: "web-config") pod "321042bf-be77-487c-ae63-1ca45d4906e3" (UID: "321042bf-be77-487c-ae63-1ca45d4906e3"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 15:38:06.663677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663638 2534 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-thanos-prometheus-http-client-file\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663671 2534 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-metrics-client-ca\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663687 2534 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-web-config\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663701 2534 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-metrics-client-certs\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663712 2534 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-config-out\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663725 2534 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663736 2534 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663748 2534 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663760 2534 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-tls\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663772 2534 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-grpc-tls\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663784 2534 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-tls-assets\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663796 2534 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663807 2534 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/321042bf-be77-487c-ae63-1ca45d4906e3-prometheus-k8s-db\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663822 2534 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-kube-rbac-proxy\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663833 2534 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-config\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663846 2534 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/321042bf-be77-487c-ae63-1ca45d4906e3-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:06.663920 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:06.663859 2534 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvpcl\" (UniqueName: 
\"kubernetes.io/projected/321042bf-be77-487c-ae63-1ca45d4906e3-kube-api-access-lvpcl\") on node \"ip-10-0-130-86.ec2.internal\" DevicePath \"\"" Apr 22 15:38:07.121705 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121673 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" exitCode=0 Apr 22 15:38:07.121705 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121700 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" exitCode=0 Apr 22 15:38:07.121705 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121706 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" exitCode=0 Apr 22 15:38:07.121705 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121712 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" exitCode=0 Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121717 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" exitCode=0 Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121722 2534 generic.go:358] "Generic (PLEG): container finished" podID="321042bf-be77-487c-ae63-1ca45d4906e3" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" exitCode=0 Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121765 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121780 2534 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121810 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121822 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121832 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121840 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121849 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121858 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"321042bf-be77-487c-ae63-1ca45d4906e3","Type":"ContainerDied","Data":"2c663dd0d73ec3393a2503621ed751a8c18863115475ca81822f903b7c706497"} Apr 22 15:38:07.121962 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.121876 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.130058 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.130037 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.137486 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.137457 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.145426 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.145258 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.147664 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.147202 2534 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:07.151501 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.150865 2534 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:07.171538 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.171486 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.179034 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.178997 2534 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:07.179348 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179333 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="config-reloader" Apr 22 15:38:07.179384 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179351 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="config-reloader" Apr 22 15:38:07.179384 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179368 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" containerName="console" Apr 22 15:38:07.179384 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179374 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" containerName="console" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179384 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="init-config-reloader" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179390 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="init-config-reloader" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179396 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="prometheus" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179401 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="prometheus" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179411 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-thanos" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179416 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-thanos" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179424 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179430 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179435 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="thanos-sidecar" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179440 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="thanos-sidecar" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179445 2534 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-web" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179450 2534 state_mem.go:107] "Deleted CPUSet assignment" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-web" Apr 22 15:38:07.179482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179488 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-thanos" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179495 2534 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="config-reloader" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179501 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="prometheus" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179508 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="5224ca01-2eef-4fc2-a0fa-4b956da404fd" containerName="console" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179514 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="thanos-sidecar" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179542 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179551 2534 memory_manager.go:356] "RemoveStaleState removing state" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" containerName="kube-rbac-proxy-web" Apr 22 15:38:07.179961 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.179673 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.183747 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.183726 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.186370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186342 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 15:38:07.186792 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186757 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 15:38:07.186946 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186783 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8hjm818pe1kap\"" Apr 22 15:38:07.187093 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186806 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 15:38:07.187093 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186809 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.187107 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186843 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186877 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-8dmq8\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186882 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.187223 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 15:38:07.187365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.186938 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 15:38:07.187935 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.187914 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 15:38:07.188017 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.187986 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 15:38:07.188319 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.188304 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.190730 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.190713 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 15:38:07.193203 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.193182 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 15:38:07.197422 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.197392 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:07.199313 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.199288 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.199689 ip-10-0-130-86 
kubenswrapper[2534]: E0422 15:38:07.199666 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.199792 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.199703 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.199792 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.199733 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.200048 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.200022 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.200140 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200056 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to 
get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.200140 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200074 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.200334 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.200312 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.200407 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200342 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.200407 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200359 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.200668 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.200648 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.200750 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200672 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.200750 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200688 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.200941 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.200924 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.200999 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200944 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = 
could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.200999 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.200963 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.201187 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.201169 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.201249 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201191 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.201249 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201205 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.201432 ip-10-0-130-86 kubenswrapper[2534]: E0422 15:38:07.201415 2534 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container 
with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.201501 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201435 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.201501 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201449 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.201669 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201647 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.201669 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201669 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.201889 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201869 2534 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.201889 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.201889 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.202076 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202058 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.202138 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202077 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.202261 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202246 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 
2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.202261 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202260 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.202482 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202463 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.202588 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202482 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.202698 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202680 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.202698 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202697 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.202901 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202882 2534 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.202901 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.202901 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.203113 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203098 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.203113 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203112 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.203299 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203281 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container 
\"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.203367 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203300 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.203493 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203477 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.203590 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203493 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.203739 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203721 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.203806 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203739 2534 scope.go:117] "RemoveContainer" 
containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.203924 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203907 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.203924 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.203924 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.204113 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204096 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.204181 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204115 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.204302 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204278 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status 
\"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.204363 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204302 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.204491 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204475 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.204491 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204490 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.204727 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204708 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.204727 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:38:07.204728 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.204932 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204912 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.204932 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.204931 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.205126 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205109 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.205195 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205126 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.205331 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205316 2534 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.205400 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205331 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.205508 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205493 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.205508 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205507 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.205754 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205736 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 
5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.205754 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205752 2534 scope.go:117] "RemoveContainer" containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.205958 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205937 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.206027 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.205959 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.206145 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206129 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.206145 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206144 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.206305 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206283 2534 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.206370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206305 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.206477 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206460 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.206477 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206476 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.206677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206661 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = could not find container 
\"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.206677 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206677 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.206876 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206856 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.206940 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.206876 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.207112 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207094 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.207170 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207115 2534 scope.go:117] "RemoveContainer" 
containerID="6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce" Apr 22 15:38:07.207365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207343 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce"} err="failed to get container status \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": rpc error: code = NotFound desc = could not find container \"6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce\": container with ID starting with 6465e06c29b38252b98e634e89a3392eddd3e08f16e3893b31477b668676c6ce not found: ID does not exist" Apr 22 15:38:07.207365 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207363 2534 scope.go:117] "RemoveContainer" containerID="b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad" Apr 22 15:38:07.207593 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207577 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad"} err="failed to get container status \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": rpc error: code = NotFound desc = could not find container \"b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad\": container with ID starting with b3c083dca68d20b9a9bfabcadbdefd7cde08d04552f896ece0fa510bbe6056ad not found: ID does not exist" Apr 22 15:38:07.207652 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207593 2534 scope.go:117] "RemoveContainer" containerID="244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25" Apr 22 15:38:07.207786 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207769 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25"} err="failed to get container status 
\"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": rpc error: code = NotFound desc = could not find container \"244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25\": container with ID starting with 244ad09f7d26dcfa0c5b97954c5bacfe6e2f8ac37177150d2f26c2a836057e25 not found: ID does not exist" Apr 22 15:38:07.207853 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207787 2534 scope.go:117] "RemoveContainer" containerID="2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e" Apr 22 15:38:07.207989 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207972 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e"} err="failed to get container status \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": rpc error: code = NotFound desc = could not find container \"2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e\": container with ID starting with 2aa03b572d1b8c5bc4f6826ec6cc835ae41f113dd9a481049f04dc0cf4de346e not found: ID does not exist" Apr 22 15:38:07.208038 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.207989 2534 scope.go:117] "RemoveContainer" containerID="35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14" Apr 22 15:38:07.208160 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.208146 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14"} err="failed to get container status \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": rpc error: code = NotFound desc = could not find container \"35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14\": container with ID starting with 35d00b806e10c9030a92e72a48f1ac2b345c260c6255385bbeb916192965fe14 not found: ID does not exist" Apr 22 15:38:07.208207 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:38:07.208161 2534 scope.go:117] "RemoveContainer" containerID="d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f" Apr 22 15:38:07.208385 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.208358 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f"} err="failed to get container status \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": rpc error: code = NotFound desc = could not find container \"d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f\": container with ID starting with d01bbada08d83b9c1644178c2f2a22565288706e9e92a2e36dbcbbed0baf684f not found: ID does not exist" Apr 22 15:38:07.208451 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.208387 2534 scope.go:117] "RemoveContainer" containerID="5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a" Apr 22 15:38:07.208661 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.208644 2534 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a"} err="failed to get container status \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": rpc error: code = NotFound desc = could not find container \"5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a\": container with ID starting with 5217b5b7541c94c8cf5ce7d2fdd33fff6425e55168d402cd9c05fdc2d4fd254a not found: ID does not exist" Apr 22 15:38:07.369098 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369053 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369098 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369102 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369163 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369181 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-web-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369210 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369287 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369337 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369368 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd8wn\" (UniqueName: \"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-kube-api-access-jd8wn\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369398 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369436 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 
ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369473 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369509 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-config-out\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369568 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369585 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369585 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369861 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369634 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369861 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369663 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369861 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369690 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.369861 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.369720 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471138 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471097 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471138 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471141 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-web-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471168 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471197 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471233 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471260 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd8wn\" (UniqueName: \"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-kube-api-access-jd8wn\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471288 2534 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471314 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471338 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471370 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471362 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-config-out\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471393 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471416 
2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471445 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471469 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471494 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471557 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471614 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.471768 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471650 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.471955 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.472126 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.472172 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.473661 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.474231 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475253 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.474957 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475798 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.475391 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.475873 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.475823 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.476549 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.476506 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/4f00f379-693a-4eac-9e3f-e627b0765443-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.476652 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.476626 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.476898 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.476824 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.477214 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.477195 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-web-config\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.477311 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.477250 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.477412 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.477389 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.477568 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.477516 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.477638 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.477619 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/4f00f379-693a-4eac-9e3f-e627b0765443-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.478369 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.478352 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4f00f379-693a-4eac-9e3f-e627b0765443-config-out\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.481222 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.481202 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd8wn\" (UniqueName: 
\"kubernetes.io/projected/4f00f379-693a-4eac-9e3f-e627b0765443-kube-api-access-jd8wn\") pod \"prometheus-k8s-0\" (UID: \"4f00f379-693a-4eac-9e3f-e627b0765443\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.495201 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.495170 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:38:07.639247 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:07.639205 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 15:38:07.642743 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:38:07.642704 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f00f379_693a_4eac_9e3f_e627b0765443.slice/crio-75d132745ebfb7949362fc5683e7f6f28a540888305f8621e1954e56a8c3c2f7 WatchSource:0}: Error finding container 75d132745ebfb7949362fc5683e7f6f28a540888305f8621e1954e56a8c3c2f7: Status 404 returned error can't find the container with id 75d132745ebfb7949362fc5683e7f6f28a540888305f8621e1954e56a8c3c2f7 Apr 22 15:38:08.126376 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:08.126278 2534 generic.go:358] "Generic (PLEG): container finished" podID="4f00f379-693a-4eac-9e3f-e627b0765443" containerID="49477fae76b48074b577010aa8c3749e635401b7ca0c4ac3023a9fa3b3f736ff" exitCode=0 Apr 22 15:38:08.126547 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:08.126378 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerDied","Data":"49477fae76b48074b577010aa8c3749e635401b7ca0c4ac3023a9fa3b3f736ff"} Apr 22 15:38:08.126547 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:08.126417 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"75d132745ebfb7949362fc5683e7f6f28a540888305f8621e1954e56a8c3c2f7"} Apr 22 15:38:08.287289 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:08.287253 2534 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="321042bf-be77-487c-ae63-1ca45d4906e3" path="/var/lib/kubelet/pods/321042bf-be77-487c-ae63-1ca45d4906e3/volumes" Apr 22 15:38:09.133491 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133449 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"872ff2b14dee64c310d45198ed34820720c7d63096b2602e36aa7dc73851cc70"} Apr 22 15:38:09.133491 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133485 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"a4823d0dca7ccd3c06e61c462a38f54dd888b7a703a1c907813a55820b34f44c"} Apr 22 15:38:09.133491 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133497 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"5b8160c30d6b0b2c59b5c51c000d4479d93799f4bc2ad7e6f31f61e5ceccb2cf"} Apr 22 15:38:09.133741 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133505 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"666762d2baa4ae4aa4bf62345c933d3286f33cf998b957aef8a1235fe524795b"} Apr 22 15:38:09.133741 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133514 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"3d046f681c1c6606b105990cdb9c3c3da488dd02e144cec2917f1b0b27239851"} Apr 22 15:38:09.133741 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.133543 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"4f00f379-693a-4eac-9e3f-e627b0765443","Type":"ContainerStarted","Data":"e4b3211eb808b526db75971db56df4ab4220d3aa207ded0d05b53cbcb1f029f0"} Apr 22 15:38:09.162897 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:09.162844 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.162825713 podStartE2EDuration="2.162825713s" podCreationTimestamp="2026-04-22 15:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 15:38:09.1613022 +0000 UTC m=+227.442455841" watchObservedRunningTime="2026-04-22 15:38:09.162825713 +0000 UTC m=+227.443979402" Apr 22 15:38:12.496247 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:38:12.496205 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:39:07.496210 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:07.496170 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:39:07.512043 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:07.512017 2534 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:39:08.319888 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:08.319854 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 15:39:22.171096 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:22.171064 2534 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:39:22.171096 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:22.171083 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:39:22.183798 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:39:22.183763 2534 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 15:40:01.322599 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.322557 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x7zbb"] Apr 22 15:40:01.325747 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.325727 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.328634 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.328615 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 15:40:01.333979 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.333951 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x7zbb"] Apr 22 15:40:01.470474 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.470435 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-dbus\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.470474 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.470479 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-original-pull-secret\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.470729 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.470511 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-kubelet-config\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.571544 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.571489 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-dbus\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.571725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.571577 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-original-pull-secret\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.571725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.571626 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-kubelet-config\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.571725 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.571712 2534 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-dbus\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.571835 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.571733 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-kubelet-config\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.574282 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.574224 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b49e4a28-cf8f-4f96-95ad-594e9ee849d2-original-pull-secret\") pod \"global-pull-secret-syncer-x7zbb\" (UID: \"b49e4a28-cf8f-4f96-95ad-594e9ee849d2\") " pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.635799 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.635758 2534 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x7zbb" Apr 22 15:40:01.758354 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.758321 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x7zbb"] Apr 22 15:40:01.760849 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:40:01.760815 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb49e4a28_cf8f_4f96_95ad_594e9ee849d2.slice/crio-923ca5baa6609a7bf4b44dde2fe6df1edd73dbb30376cf3dc5b7b771521682c5 WatchSource:0}: Error finding container 923ca5baa6609a7bf4b44dde2fe6df1edd73dbb30376cf3dc5b7b771521682c5: Status 404 returned error can't find the container with id 923ca5baa6609a7bf4b44dde2fe6df1edd73dbb30376cf3dc5b7b771521682c5 Apr 22 15:40:01.762498 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:01.762482 2534 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 15:40:02.457889 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:02.457851 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x7zbb" event={"ID":"b49e4a28-cf8f-4f96-95ad-594e9ee849d2","Type":"ContainerStarted","Data":"923ca5baa6609a7bf4b44dde2fe6df1edd73dbb30376cf3dc5b7b771521682c5"} Apr 22 15:40:07.474349 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:07.474298 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x7zbb" event={"ID":"b49e4a28-cf8f-4f96-95ad-594e9ee849d2","Type":"ContainerStarted","Data":"85f0bdd33091c722a5c3e3a6cc56f0b9578bcdc983635037cdcf3fec88be1dab"} Apr 22 15:40:07.491504 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:40:07.491445 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x7zbb" podStartSLOduration=1.135911627 podStartE2EDuration="6.491428388s" podCreationTimestamp="2026-04-22 15:40:01 +0000 UTC" 
firstStartedPulling="2026-04-22 15:40:01.762683866 +0000 UTC m=+340.043837493" lastFinishedPulling="2026-04-22 15:40:07.118200634 +0000 UTC m=+345.399354254" observedRunningTime="2026-04-22 15:40:07.490153324 +0000 UTC m=+345.771306967" watchObservedRunningTime="2026-04-22 15:40:07.491428388 +0000 UTC m=+345.772582026" Apr 22 15:41:48.323175 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.323094 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ftq7q"] Apr 22 15:41:48.325642 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.325619 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.328496 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.328468 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 15:41:48.328696 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.328647 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-ncfvh\"" Apr 22 15:41:48.329598 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.329575 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 15:41:48.334041 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.334015 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ftq7q"] Apr 22 15:41:48.517735 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.517699 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-bound-sa-token\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: \"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.517936 ip-10-0-130-86 
kubenswrapper[2534]: I0422 15:41:48.517837 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6wt7\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-kube-api-access-s6wt7\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: \"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.618901 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.618805 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s6wt7\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-kube-api-access-s6wt7\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: \"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.618901 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.618877 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-bound-sa-token\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: \"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.627594 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.627546 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-bound-sa-token\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: \"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.627759 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.627669 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6wt7\" (UniqueName: \"kubernetes.io/projected/8ccb1820-b96f-4126-ad83-0db7e43663eb-kube-api-access-s6wt7\") pod \"cert-manager-79c8d999ff-ftq7q\" (UID: 
\"8ccb1820-b96f-4126-ad83-0db7e43663eb\") " pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.655730 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.655698 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-ftq7q" Apr 22 15:41:48.797467 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:48.797428 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-ftq7q"] Apr 22 15:41:48.800682 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:41:48.800628 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ccb1820_b96f_4126_ad83_0db7e43663eb.slice/crio-747be8c4ebac539ab4d6389236ce72b62648a29a76d7005d9f774c172d8e1907 WatchSource:0}: Error finding container 747be8c4ebac539ab4d6389236ce72b62648a29a76d7005d9f774c172d8e1907: Status 404 returned error can't find the container with id 747be8c4ebac539ab4d6389236ce72b62648a29a76d7005d9f774c172d8e1907 Apr 22 15:41:49.761093 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:49.761051 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ftq7q" event={"ID":"8ccb1820-b96f-4126-ad83-0db7e43663eb","Type":"ContainerStarted","Data":"747be8c4ebac539ab4d6389236ce72b62648a29a76d7005d9f774c172d8e1907"} Apr 22 15:41:53.777534 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:53.777483 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-ftq7q" event={"ID":"8ccb1820-b96f-4126-ad83-0db7e43663eb","Type":"ContainerStarted","Data":"3588cfc47edcf387536a0907883cfeb7745566d05b5ab7f3aca6a726df722ad7"} Apr 22 15:41:53.796293 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:41:53.796239 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-ftq7q" podStartSLOduration=1.8115232639999999 podStartE2EDuration="5.796219546s" 
podCreationTimestamp="2026-04-22 15:41:48 +0000 UTC" firstStartedPulling="2026-04-22 15:41:48.80249177 +0000 UTC m=+447.083645390" lastFinishedPulling="2026-04-22 15:41:52.787188049 +0000 UTC m=+451.068341672" observedRunningTime="2026-04-22 15:41:53.794478737 +0000 UTC m=+452.075632383" watchObservedRunningTime="2026-04-22 15:41:53.796219546 +0000 UTC m=+452.077373187" Apr 22 15:42:36.061764 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.061727 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44"] Apr 22 15:42:36.064538 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.064509 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.067270 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.067239 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 22 15:42:36.068403 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.068377 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-controller-manager-dockercfg-6jrvz\"" Apr 22 15:42:36.068547 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.068377 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kubeflow-trainer-config\"" Apr 22 15:42:36.068547 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.068385 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kubeflow-trainer-webhook-cert\"" Apr 22 15:42:36.068547 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.068438 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 22 15:42:36.072694 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.072669 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44"] Apr 22 15:42:36.199295 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.199235 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.199492 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.199320 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jljph\" (UniqueName: \"kubernetes.io/projected/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kube-api-access-jljph\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.199492 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.199397 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fe0d64d-0ff5-4278-9e59-be977a0aa104-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.300316 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.300274 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 
15:42:36.300316 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.300318 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jljph\" (UniqueName: \"kubernetes.io/projected/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kube-api-access-jljph\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.300570 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.300356 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fe0d64d-0ff5-4278-9e59-be977a0aa104-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.301056 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.301034 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubeflow-trainer-config\" (UniqueName: \"kubernetes.io/configmap/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kubeflow-trainer-config\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.302911 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.302883 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9fe0d64d-0ff5-4278-9e59-be977a0aa104-cert\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.309735 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.309709 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jljph\" (UniqueName: 
\"kubernetes.io/projected/9fe0d64d-0ff5-4278-9e59-be977a0aa104-kube-api-access-jljph\") pod \"kubeflow-trainer-controller-manager-55f5694779-tkl44\" (UID: \"9fe0d64d-0ff5-4278-9e59-be977a0aa104\") " pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.374814 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.374706 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:36.500606 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.500572 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44"] Apr 22 15:42:36.504219 ip-10-0-130-86 kubenswrapper[2534]: W0422 15:42:36.504188 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fe0d64d_0ff5_4278_9e59_be977a0aa104.slice/crio-ad97bad44c6a0d336949b1119e3e6f6ef062ad4beff6cfe3f5bbbc2f4b1e3016 WatchSource:0}: Error finding container ad97bad44c6a0d336949b1119e3e6f6ef062ad4beff6cfe3f5bbbc2f4b1e3016: Status 404 returned error can't find the container with id ad97bad44c6a0d336949b1119e3e6f6ef062ad4beff6cfe3f5bbbc2f4b1e3016 Apr 22 15:42:36.904593 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:36.904554 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" event={"ID":"9fe0d64d-0ff5-4278-9e59-be977a0aa104","Type":"ContainerStarted","Data":"ad97bad44c6a0d336949b1119e3e6f6ef062ad4beff6cfe3f5bbbc2f4b1e3016"} Apr 22 15:42:38.912133 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:38.912089 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" event={"ID":"9fe0d64d-0ff5-4278-9e59-be977a0aa104","Type":"ContainerStarted","Data":"a19a3fd51c2d63f4bbde537d84bb6b37c7b77568bfb1535ecdf005408b3ba30b"} Apr 22 
15:42:38.912537 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:38.912209 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:42:38.948973 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:38.948906 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" podStartSLOduration=0.735656596 podStartE2EDuration="2.948884157s" podCreationTimestamp="2026-04-22 15:42:36 +0000 UTC" firstStartedPulling="2026-04-22 15:42:36.506327888 +0000 UTC m=+494.787481508" lastFinishedPulling="2026-04-22 15:42:38.719555449 +0000 UTC m=+497.000709069" observedRunningTime="2026-04-22 15:42:38.94581319 +0000 UTC m=+497.226966844" watchObservedRunningTime="2026-04-22 15:42:38.948884157 +0000 UTC m=+497.230037799" Apr 22 15:42:54.921421 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:42:54.921388 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kubeflow-trainer-controller-manager-55f5694779-tkl44" Apr 22 15:44:22.197706 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:44:22.197674 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:44:22.198806 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:44:22.198786 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:49:22.221367 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:49:22.221278 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:49:22.224061 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:49:22.224035 2534 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:54:22.244314 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:54:22.244269 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:54:22.246892 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:54:22.245814 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:59:22.271645 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:59:22.271500 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 15:59:22.275765 ip-10-0-130-86 kubenswrapper[2534]: I0422 15:59:22.272933 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:04:22.293069 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:04:22.292958 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:04:22.295543 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:04:22.295508 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:09:22.316440 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:09:22.316309 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:09:22.320427 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:09:22.319013 2534 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:14:22.337210 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:14:22.337095 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:14:22.341228 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:14:22.339853 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:19:22.358802 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:19:22.358686 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:19:22.363022 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:19:22.360811 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:24:22.379503 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:24:22.379387 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:24:22.384293 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:24:22.384274 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log" Apr 22 16:27:26.037415 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:27:26.037379 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-tkl44_9fe0d64d-0ff5-4278-9e59-be977a0aa104/manager/0.log" Apr 22 16:27:26.535271 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:27:26.535237 2534 log.go:25] "Finished parsing log 
file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-tkl44_9fe0d64d-0ff5-4278-9e59-be977a0aa104/manager/0.log" Apr 22 16:27:27.016814 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:27:27.016782 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kubeflow-trainer-controller-manager-55f5694779-tkl44_9fe0d64d-0ff5-4278-9e59-be977a0aa104/manager/0.log" Apr 22 16:28:08.493817 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.493779 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrh8b/must-gather-xbf8q"] Apr 22 16:28:08.496767 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.496743 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.499297 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.499272 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"kube-root-ca.crt\"" Apr 22 16:28:08.499424 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.499409 2534 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-jrh8b\"/\"openshift-service-ca.crt\"" Apr 22 16:28:08.506259 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.506231 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/must-gather-xbf8q"] Apr 22 16:28:08.580976 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.580931 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6b9907b-5ce5-4902-adb7-de602d605ecc-must-gather-output\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.581147 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.581012 2534 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mld\" (UniqueName: \"kubernetes.io/projected/c6b9907b-5ce5-4902-adb7-de602d605ecc-kube-api-access-55mld\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.682039 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.681949 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6b9907b-5ce5-4902-adb7-de602d605ecc-must-gather-output\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.682039 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.682020 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55mld\" (UniqueName: \"kubernetes.io/projected/c6b9907b-5ce5-4902-adb7-de602d605ecc-kube-api-access-55mld\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.682318 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.682297 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c6b9907b-5ce5-4902-adb7-de602d605ecc-must-gather-output\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.690740 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.690707 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mld\" (UniqueName: \"kubernetes.io/projected/c6b9907b-5ce5-4902-adb7-de602d605ecc-kube-api-access-55mld\") pod \"must-gather-xbf8q\" (UID: \"c6b9907b-5ce5-4902-adb7-de602d605ecc\") " pod="openshift-must-gather-jrh8b/must-gather-xbf8q" Apr 22 16:28:08.807080 
ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.806979 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/must-gather-xbf8q"
Apr 22 16:28:08.937704 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.937650 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/must-gather-xbf8q"]
Apr 22 16:28:08.940055 ip-10-0-130-86 kubenswrapper[2534]: W0422 16:28:08.940017 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6b9907b_5ce5_4902_adb7_de602d605ecc.slice/crio-4dbf3f8804a4bdc6a1acacb74cf1c9c1633f7d16be42f697008cc1f3e93c50b4 WatchSource:0}: Error finding container 4dbf3f8804a4bdc6a1acacb74cf1c9c1633f7d16be42f697008cc1f3e93c50b4: Status 404 returned error can't find the container with id 4dbf3f8804a4bdc6a1acacb74cf1c9c1633f7d16be42f697008cc1f3e93c50b4
Apr 22 16:28:08.941779 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:08.941756 2534 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:28:09.735126 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:09.735090 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/must-gather-xbf8q" event={"ID":"c6b9907b-5ce5-4902-adb7-de602d605ecc","Type":"ContainerStarted","Data":"4dbf3f8804a4bdc6a1acacb74cf1c9c1633f7d16be42f697008cc1f3e93c50b4"}
Apr 22 16:28:10.742128 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:10.741337 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/must-gather-xbf8q" event={"ID":"c6b9907b-5ce5-4902-adb7-de602d605ecc","Type":"ContainerStarted","Data":"9ee9328375ea07ceac1229ec23d0687cee8b214eb28f5930988eb91fd22f4bbb"}
Apr 22 16:28:10.742128 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:10.741383 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/must-gather-xbf8q" event={"ID":"c6b9907b-5ce5-4902-adb7-de602d605ecc","Type":"ContainerStarted","Data":"245a8206b65264839410f5ccac66acf7d3f46ca69447ec370a49b52a513b43c3"}
Apr 22 16:28:10.757249 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:10.757189 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrh8b/must-gather-xbf8q" podStartSLOduration=1.899236325 podStartE2EDuration="2.757172423s" podCreationTimestamp="2026-04-22 16:28:08 +0000 UTC" firstStartedPulling="2026-04-22 16:28:08.941908213 +0000 UTC m=+3227.223061837" lastFinishedPulling="2026-04-22 16:28:09.799844316 +0000 UTC m=+3228.080997935" observedRunningTime="2026-04-22 16:28:10.756293115 +0000 UTC m=+3229.037446761" watchObservedRunningTime="2026-04-22 16:28:10.757172423 +0000 UTC m=+3229.038326065"
Apr 22 16:28:11.262665 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:11.262625 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x7zbb_b49e4a28-cf8f-4f96-95ad-594e9ee849d2/global-pull-secret-syncer/0.log"
Apr 22 16:28:11.309915 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:11.309883 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6vjfh_95598f8a-db85-47f2-859f-d47efcdbfa09/konnectivity-agent/0.log"
Apr 22 16:28:11.406518 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:11.406481 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-86.ec2.internal_93607452ba047e869102040d23558016/haproxy/0.log"
Apr 22 16:28:15.084286 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.084258 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-qdtnd_7d4bedbf-e56a-428e-be2d-5f1a5111951f/cluster-monitoring-operator/0.log"
Apr 22 16:28:15.209830 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.209797 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-hbnxn_202e188a-bbd5-4a7c-b299-df34fb70814b/monitoring-plugin/0.log"
Apr 22 16:28:15.327205 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.327166 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hws6q_388711be-4fdc-4e0d-85ec-767a64fbe0ec/node-exporter/0.log"
Apr 22 16:28:15.352832 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.352753 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hws6q_388711be-4fdc-4e0d-85ec-767a64fbe0ec/kube-rbac-proxy/0.log"
Apr 22 16:28:15.377627 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.377603 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hws6q_388711be-4fdc-4e0d-85ec-767a64fbe0ec/init-textfile/0.log"
Apr 22 16:28:15.567968 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.567919 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/prometheus/0.log"
Apr 22 16:28:15.590540 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.590490 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/config-reloader/0.log"
Apr 22 16:28:15.617297 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.617203 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/thanos-sidecar/0.log"
Apr 22 16:28:15.639867 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.639827 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/kube-rbac-proxy-web/0.log"
Apr 22 16:28:15.668004 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.667933 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/kube-rbac-proxy/0.log"
Apr 22 16:28:15.705727 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.705693 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/kube-rbac-proxy-thanos/0.log"
Apr 22 16:28:15.743217 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.743180 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_4f00f379-693a-4eac-9e3f-e627b0765443/init-config-reloader/0.log"
Apr 22 16:28:15.980448 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:15.980423 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/thanos-query/0.log"
Apr 22 16:28:16.011023 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:16.010972 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/kube-rbac-proxy-web/0.log"
Apr 22 16:28:16.039333 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:16.039301 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/kube-rbac-proxy/0.log"
Apr 22 16:28:16.061604 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:16.061572 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/prom-label-proxy/0.log"
Apr 22 16:28:16.087042 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:16.086980 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/kube-rbac-proxy-rules/0.log"
Apr 22 16:28:16.109171 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:16.109128 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-6947f4988d-hdlkz_1919ccae-c280-4f0e-908b-93a4d05a4ecb/kube-rbac-proxy-metrics/0.log"
Apr 22 16:28:18.279059 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.279014 2534 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"]
Apr 22 16:28:18.284195 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.284160 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.287027 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.286806 2534 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-jrh8b\"/\"default-dockercfg-rrfw9\""
Apr 22 16:28:18.291800 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.291771 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"]
Apr 22 16:28:18.371998 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.371950 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-proc\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.372193 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.372011 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-sys\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.372193 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.372051 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-podres\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.372193 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.372083 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwsn6\" (UniqueName: \"kubernetes.io/projected/c796040e-d80b-4617-8bd5-991ea0eeb5e4-kube-api-access-vwsn6\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.372370 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.372199 2534 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-lib-modules\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473667 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473605 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-podres\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473865 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473674 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwsn6\" (UniqueName: \"kubernetes.io/projected/c796040e-d80b-4617-8bd5-991ea0eeb5e4-kube-api-access-vwsn6\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473865 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473785 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-lib-modules\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473865 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473808 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-podres\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473865 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473836 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-proc\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.473865 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473862 2534 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-sys\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.474141 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.473970 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-sys\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.474141 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.474086 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-lib-modules\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.474141 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.474133 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c796040e-d80b-4617-8bd5-991ea0eeb5e4-proc\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.482441 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.482389 2534 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwsn6\" (UniqueName: \"kubernetes.io/projected/c796040e-d80b-4617-8bd5-991ea0eeb5e4-kube-api-access-vwsn6\") pod \"perf-node-gather-daemonset-pfd8l\" (UID: \"c796040e-d80b-4617-8bd5-991ea0eeb5e4\") " pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.598830 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.598749 2534 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:18.747947 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.747909 2534 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"]
Apr 22 16:28:18.752508 ip-10-0-130-86 kubenswrapper[2534]: W0422 16:28:18.752469 2534 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc796040e_d80b_4617_8bd5_991ea0eeb5e4.slice/crio-4380a9946b7f4485cb85aa7fe4c80b507de2157bbaec7c78f79e6db6f6566cc1 WatchSource:0}: Error finding container 4380a9946b7f4485cb85aa7fe4c80b507de2157bbaec7c78f79e6db6f6566cc1: Status 404 returned error can't find the container with id 4380a9946b7f4485cb85aa7fe4c80b507de2157bbaec7c78f79e6db6f6566cc1
Apr 22 16:28:18.772260 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:18.772223 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l" event={"ID":"c796040e-d80b-4617-8bd5-991ea0eeb5e4","Type":"ContainerStarted","Data":"4380a9946b7f4485cb85aa7fe4c80b507de2157bbaec7c78f79e6db6f6566cc1"}
Apr 22 16:28:19.119547 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.119443 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqvj8_aec47240-1952-48af-b917-7fdd3074710a/dns/0.log"
Apr 22 16:28:19.139964 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.139914 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-cqvj8_aec47240-1952-48af-b917-7fdd3074710a/kube-rbac-proxy/0.log"
Apr 22 16:28:19.255168 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.255139 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wbjzt_1c6ce3c0-cf70-422a-a8f8-3889b4bcc3d0/dns-node-resolver/0.log"
Apr 22 16:28:19.713053 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.713019 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l947q_60d1d72d-9ad8-4148-82bc-8ae8873fe4c8/node-ca/0.log"
Apr 22 16:28:19.778011 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.777972 2534 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l" event={"ID":"c796040e-d80b-4617-8bd5-991ea0eeb5e4","Type":"ContainerStarted","Data":"37abcd110c3eeaaeca942b1ae2913d1f213fd08a4511db49b5aa6feedfa629c8"}
Apr 22 16:28:19.778224 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.778104 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:19.793111 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:19.793048 2534 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l" podStartSLOduration=1.793029175 podStartE2EDuration="1.793029175s" podCreationTimestamp="2026-04-22 16:28:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:28:19.792081836 +0000 UTC m=+3238.073235504" watchObservedRunningTime="2026-04-22 16:28:19.793029175 +0000 UTC m=+3238.074182818"
Apr 22 16:28:20.395801 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:20.395768 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-779cbd5446-jz78p_e1977e29-cbfa-4dfa-910e-a7f5d173c2e9/router/0.log"
Apr 22 16:28:20.734392 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:20.734357 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-fshk9_b91d8a6d-a426-4263-ae63-99ecd3ff6949/serve-healthcheck-canary/0.log"
Apr 22 16:28:21.079778 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:21.079692 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d45rg_4269d0dd-d09b-4927-96de-1b3ab59b5ec7/insights-operator/0.log"
Apr 22 16:28:21.080805 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:21.080773 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-d45rg_4269d0dd-d09b-4927-96de-1b3ab59b5ec7/insights-operator/1.log"
Apr 22 16:28:21.228765 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:21.228732 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrlq_15ffedbe-fc51-476b-a53a-95c611e46693/kube-rbac-proxy/0.log"
Apr 22 16:28:21.248598 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:21.248567 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrlq_15ffedbe-fc51-476b-a53a-95c611e46693/exporter/0.log"
Apr 22 16:28:21.269252 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:21.269220 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-qsrlq_15ffedbe-fc51-476b-a53a-95c611e46693/extractor/0.log"
Apr 22 16:28:25.794602 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:25.794565 2534 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-jrh8b/perf-node-gather-daemonset-pfd8l"
Apr 22 16:28:25.897723 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:25.897677 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zfv4q_3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9/migrator/0.log"
Apr 22 16:28:25.917845 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:25.917815 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-zfv4q_3ea1ff9e-4077-42b1-a6ea-bfe4d96c42e9/graceful-termination/0.log"
Apr 22 16:28:26.195268 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:26.195233 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gp7j9_6fc37a4f-3b0b-435c-a8c6-ceab191ad796/kube-storage-version-migrator-operator/1.log"
Apr 22 16:28:26.195468 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:26.195310 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-gp7j9_6fc37a4f-3b0b-435c-a8c6-ceab191ad796/kube-storage-version-migrator-operator/0.log"
Apr 22 16:28:27.071219 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.071189 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/kube-multus-additional-cni-plugins/0.log"
Apr 22 16:28:27.091877 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.091845 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/egress-router-binary-copy/0.log"
Apr 22 16:28:27.112957 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.112924 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/cni-plugins/0.log"
Apr 22 16:28:27.135371 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.135345 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/bond-cni-plugin/0.log"
Apr 22 16:28:27.158972 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.158930 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/routeoverride-cni/0.log"
Apr 22 16:28:27.181840 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.181810 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/whereabouts-cni-bincopy/0.log"
Apr 22 16:28:27.206389 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.206356 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7b4hc_ca5aa4d2-6513-4631-aeb7-c9120934e117/whereabouts-cni/0.log"
Apr 22 16:28:27.612348 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.612291 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vtcgb_d870e76b-ada6-4b96-8ffa-57ad8f8da412/kube-multus/0.log"
Apr 22 16:28:27.637380 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.637349 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-82tqk_c2e72f84-cb38-472e-abba-c2f44adaf2fd/network-metrics-daemon/0.log"
Apr 22 16:28:27.658624 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:27.658596 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-82tqk_c2e72f84-cb38-472e-abba-c2f44adaf2fd/kube-rbac-proxy/0.log"
Apr 22 16:28:28.745551 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.745493 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-controller/0.log"
Apr 22 16:28:28.764301 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.764266 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/0.log"
Apr 22 16:28:28.794301 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.794267 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovn-acl-logging/1.log"
Apr 22 16:28:28.815341 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.815306 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/kube-rbac-proxy-node/0.log"
Apr 22 16:28:28.837005 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.836969 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 16:28:28.855929 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.855899 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/northd/0.log"
Apr 22 16:28:28.876347 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.876320 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/nbdb/0.log"
Apr 22 16:28:28.900685 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:28.900654 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/sbdb/0.log"
Apr 22 16:28:29.061675 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:29.061644 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-fpdtl_92d55e47-0b30-4f98-aff7-3b7325bb3839/ovnkube-controller/0.log"
Apr 22 16:28:30.420449 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:30.420421 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-h8d7v_7febb925-9a97-4316-9acd-71100c471eb1/network-check-target-container/0.log"
Apr 22 16:28:31.315399 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:31.315371 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hdcgf_f42d3546-8a90-41a3-a8b3-01565e6ed78c/iptables-alerter/0.log"
Apr 22 16:28:31.960293 ip-10-0-130-86 kubenswrapper[2534]: I0422 16:28:31.960242 2534 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-z7pk7_8781a272-57f8-42fd-84df-15814bf56a2a/tuned/0.log"