Apr 24 23:51:13.789057 ip-10-0-138-5 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 24 23:51:13.789070 ip-10-0-138-5 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 24 23:51:13.789079 ip-10-0-138-5 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 24 23:51:13.789373 ip-10-0-138-5 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 24 23:51:23.893380 ip-10-0-138-5 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 24 23:51:23.893398 ip-10-0-138-5 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8cf7fde49efe41b0bc0d28bd3447b084 --
Apr 24 23:53:47.099250 ip-10-0-138-5 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 23:53:47.571261 ip-10-0-138-5 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:47.571261 ip-10-0-138-5 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 23:53:47.571261 ip-10-0-138-5 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:47.571261 ip-10-0-138-5 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 23:53:47.571261 ip-10-0-138-5 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 23:53:47.572829 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.572682    2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 23:53:47.576162 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576144    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.576162 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576162    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576166    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576170    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576173    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576178    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576183    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576187    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576191    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576194    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576197    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576200    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576202    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576205    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576208    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576211    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576213    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576216    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576219    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576221    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.576234 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576224    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576227    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576230    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576233    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576236    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576239    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576241    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576251    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576254    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576257    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576260    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576264    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576268    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576270    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576273    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576275    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576278    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576281    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576283    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576286    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.576719 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576288    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576291    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576294    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576297    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576300    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576303    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576306    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576309    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576311    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576314    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576317    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576319    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576322    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576324    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576327    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576330    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576333    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576335    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576338    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576341    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.577217 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576343    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576346    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576349    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576353    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576355    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576358    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576360    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576363    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576366    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576368    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576370    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576373    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576376    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576378    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576381    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576384    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576386    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576390    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576392    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576394    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.577708 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576397    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576399    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576402    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576405    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576407    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576410    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576842    2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576848    2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576852    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576855    2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576858    2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576861    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576864    2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576867    2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576870    2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576873    2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576875    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576878    2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576881    2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.578204 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576883    2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576886    2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576890    2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576893    2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576895    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576898    2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576901    2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576919    2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576924    2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576928    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576932    2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576935    2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576938    2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576941    2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576944    2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576947    2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576950    2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576954    2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576958    2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576961    2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.578750 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576964    2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576967    2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576970    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576973    2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576976    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576979    2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576981    2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576984    2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576986    2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576989    2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576991    2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576994    2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576996    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.576999    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577002    2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577005    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577007    2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577010    2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577014    2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:47.579313 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577017    2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577020    2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577024    2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577026    2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577029    2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577031    2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577034    2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577036    2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577040    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577043    2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577046    2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577049    2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577052    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577054    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577057    2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577059    2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577062    2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577064    2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577067    2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577070    2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.579795 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577072    2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577075    2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577077    2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577080    2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577083    2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577085    2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577087    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577090    2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577092    2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577095    2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577097    2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577100    2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577102    2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577105    2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577185    2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577197    2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577206    2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577212    2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577219    2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577225    2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577231    2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 23:53:47.580312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577236    2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577239    2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577243    2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577246    2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577249    2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577252    2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577255    2572 flags.go:64] FLAG: --cgroup-root=""
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577259    2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577262    2572 flags.go:64] FLAG: --client-ca-file=""
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577265    2572 flags.go:64] FLAG: --cloud-config=""
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577268    2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577271    2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577280    2572 flags.go:64] FLAG: --cluster-domain=""
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577283    2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577286    2572 flags.go:64] FLAG: --config-dir=""
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577289    2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577293    2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577298    2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577301    2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577304    2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577307    2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577311    2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577314    2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577317    2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577320    2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 23:53:47.580827 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577323    2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577328    2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577331    2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577334    2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577338    2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577341    2572 flags.go:64] FLAG: --enable-server="true"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577345    2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577349    2572 flags.go:64] FLAG: --event-burst="100"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577352    2572 flags.go:64] FLAG: --event-qps="50"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577355    2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577359    2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577362    2572 flags.go:64] FLAG: --eviction-hard=""
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577366    2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577369    2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577372    2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577375    2572 flags.go:64] FLAG: --eviction-soft=""
Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424
23:53:47.577378 2572 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577384 2572 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577388 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577391 2572 flags.go:64] FLAG: --experimental-mounter-path="" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577394 2572 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577397 2572 flags.go:64] FLAG: --fail-swap-on="true" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577400 2572 flags.go:64] FLAG: --feature-gates="" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577405 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577408 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 23:53:47.581448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577411 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577415 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577419 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577422 2572 flags.go:64] FLAG: --help="false" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577425 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577428 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 23:53:47.582068 
ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577432 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577435 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577438 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577442 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577445 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577449 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577452 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577455 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577458 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577461 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577464 2572 flags.go:64] FLAG: --kube-reserved="" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577468 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577471 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577474 2572 flags.go:64] FLAG: --kubelet-cgroups="" 
Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577477 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577480 2572 flags.go:64] FLAG: --lock-file="" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577483 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577487 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 23:53:47.582068 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577491 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577497 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577500 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577503 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577506 2572 flags.go:64] FLAG: --logging-format="text" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577509 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577513 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577516 2572 flags.go:64] FLAG: --manifest-url="" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577519 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577523 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577526 2572 flags.go:64] FLAG: 
--max-open-files="1000000" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577531 2572 flags.go:64] FLAG: --max-pods="110" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577534 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577537 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577540 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577543 2572 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577547 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577549 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577552 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577561 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577565 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577568 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577571 2572 flags.go:64] FLAG: --pod-cidr="" Apr 24 23:53:47.582644 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577574 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:47.577580 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577583 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577587 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577590 2572 flags.go:64] FLAG: --port="10250" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577593 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577596 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0f92e9fdd9018b289" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577600 2572 flags.go:64] FLAG: --qos-reserved="" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577602 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577607 2572 flags.go:64] FLAG: --register-node="true" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577610 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577613 2572 flags.go:64] FLAG: --register-with-taints="" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577616 2572 flags.go:64] FLAG: --registry-burst="10" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577619 2572 flags.go:64] FLAG: --registry-qps="5" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577622 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577625 2572 flags.go:64] FLAG: --reserved-memory="" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577629 2572 flags.go:64] FLAG: 
--resolv-conf="/etc/resolv.conf" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577632 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577635 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577638 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577642 2572 flags.go:64] FLAG: --runonce="false" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577645 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577648 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577651 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577654 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577658 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 23:53:47.583216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577661 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577664 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577667 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577672 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577675 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 23:53:47.583831 ip-10-0-138-5 
kubenswrapper[2572]: I0424 23:53:47.577679 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577682 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577685 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577688 2572 flags.go:64] FLAG: --system-cgroups="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577691 2572 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577697 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577700 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577703 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577708 2572 flags.go:64] FLAG: --tls-min-version="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577711 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577718 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577722 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577725 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577728 2572 flags.go:64] FLAG: --v="2" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577733 2572 flags.go:64] FLAG: --version="false" Apr 24 23:53:47.583831 ip-10-0-138-5 
kubenswrapper[2572]: I0424 23:53:47.577737 2572 flags.go:64] FLAG: --vmodule="" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577741 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.577745 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577843 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 23:53:47.583831 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577847 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577850 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577853 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577856 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577859 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577862 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577865 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577868 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577871 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 
23:53:47.577874 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577877 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577881 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577886 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577890 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577894 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577897 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577901 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577918 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577922 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577926 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 23:53:47.584418 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577931 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577934 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 23:53:47.584965 ip-10-0-138-5 
kubenswrapper[2572]: W0424 23:53:47.577937 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577940 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577943 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577946 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577948 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577951 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577954 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577957 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577959 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577962 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577965 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577968 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577970 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall 
Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577973 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577977 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577981 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577984 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 23:53:47.584965 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577987 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577990 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577993 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.577995 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578000 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578002 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578006 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578008 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 23:53:47.585416 
ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578011 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578014 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578017 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578019 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578022 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578024 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578027 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578030 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578032 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578035 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578038 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 23:53:47.585416 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578040 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578043 2572 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578045 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578048 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578050 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578053 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578056 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578058 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578061 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578063 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578066 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578068 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578071 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578073 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578076 2572 
feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578079 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578081 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578086 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578089 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578092 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.585889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578095 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578097 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578100 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578103 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578105 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578107 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.578110 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.579059 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.585817 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.585835 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585886 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585892 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585896 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585899 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585902 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.586391 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585918 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585921 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585924 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585927 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585930 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585932 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585935 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585938 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585941 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585944 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585946 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585949 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585952 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585955 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585957 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585960 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585962 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585967 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585969 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585972 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.586785 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585975 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585978 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585981 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585984 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585986 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585989 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585992 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585995 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.585997 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586000 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586003 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586006 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586008 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586011 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586014 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586017 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586019 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586022 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586026 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.587288 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586031 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586034 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586037 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586040 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586043 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586046 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586048 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586051 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586054 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586057 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586059 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586062 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586065 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586068 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586070 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586073 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586076 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586079 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586081 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586084 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.587752 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586087 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586090 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586092 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586095 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586097 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586100 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586102 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586105 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586108 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586110 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586113 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586115 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586118 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586120 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586123 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586127 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586132 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586135 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586138 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.588282 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586141 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586144 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586146 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.586151 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586254 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586259 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586262 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586266 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586268 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586271 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586274 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586277 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586280 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586283 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586286 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 23:53:47.588735 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586289 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586291 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586294 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586296 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586299 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586302 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586304 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586307 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586309 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586312 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586314 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586317 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586320 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586322 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586325 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586327 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586330 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586332 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586335 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586337 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 23:53:47.589123 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586340 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586342 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586345 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586347 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586350 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586353 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586356 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586358 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586361 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586364 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586366 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586369 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586371 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586374 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586377 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586379 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586382 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586384 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586387 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586389 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 23:53:47.589611 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586392 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586395 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586397 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586400 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586403 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586405 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586408 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586411 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586413 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586415 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586418 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586420 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586423 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586426 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586428 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586431 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586434 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586437 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586440 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586442 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 23:53:47.590117 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586445 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586447 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586450 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586453 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586455 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586459 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586463 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586466 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586469 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586472 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586475 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586477 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586480 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586482 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:47.586485 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 23:53:47.590604 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.586490 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 23:53:47.590987 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.587340 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 23:53:47.590987 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.590352 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 23:53:47.593390 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.593377 2572 server.go:1019] "Starting client certificate rotation"
Apr 24 23:53:47.593497 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.593481 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:47.593532 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.593523 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 23:53:47.619112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.619084 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:47.625442 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.625415 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 23:53:47.639748 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.639725 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 24 23:53:47.645540 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.645522 2572 log.go:25] "Validated CRI v1 image API"
Apr 24 23:53:47.646792 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.646777 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 23:53:47.654024 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.653995 2572 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 915a8d06-bed4-45ae-b333-ece45b9553a0:/dev/nvme0n1p4 e34da326-3c8c-43b7-9ff2-9b0aeda65ea5:/dev/nvme0n1p3]
Apr 24 23:53:47.654105 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.654023 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 23:53:47.655145 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.655125 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:47.659312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.659185 2572 manager.go:217] Machine: {Timestamp:2026-04-24 23:53:47.657979232 +0000 UTC m=+0.432412100 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100332 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28807cb187e97bba7a6f8a410de9ad SystemUUID:ec28807c-b187-e97b-ba7a-6f8a410de9ad BootID:8cf7fde4-9efe-41b0-bc0d-28bd3447b084 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c7:93:f2:25:33 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c7:93:f2:25:33 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:82:f8:f2:ba:ce:b7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 23:53:47.659312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.659307 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 23:53:47.659416 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.659400 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 23:53:47.661087 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661059 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 23:53:47.661275 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661090 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-5.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value
":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 23:53:47.661361 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661288 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 23:53:47.661361 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661298 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 23:53:47.661361 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661311 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:47.661361 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.661323 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 23:53:47.662649 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.662635 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:47.662769 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.662759 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 23:53:47.666478 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.666465 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 24 23:53:47.666536 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.666488 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 23:53:47.666536 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.666511 2572 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 23:53:47.666536 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.666521 2572 kubelet.go:397] 
"Adding apiserver pod source" Apr 24 23:53:47.666536 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.666530 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 23:53:47.669609 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.669589 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:47.669682 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.669620 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 23:53:47.674685 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.674653 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 23:53:47.676133 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.676119 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 23:53:47.677581 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677567 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677591 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677600 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677606 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677612 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677618 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 
23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677624 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 23:53:47.677631 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677629 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 23:53:47.677825 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677637 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 23:53:47.677825 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677646 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 23:53:47.677825 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677657 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 23:53:47.677825 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.677666 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 23:53:47.678717 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.678705 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 23:53:47.678760 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.678720 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 23:53:47.682688 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682657 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-5.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 23:53:47.682805 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.682685 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.Node" Apr 24 23:53:47.682805 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682700 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 23:53:47.682805 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682734 2572 server.go:1295] "Started kubelet" Apr 24 23:53:47.682805 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.682751 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 23:53:47.683021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682817 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 23:53:47.683021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682875 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 23:53:47.683021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.682880 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 23:53:47.683557 ip-10-0-138-5 systemd[1]: Started Kubernetes Kubelet. 
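The `nodeConfig` dump a few records above carries the inputs the kubelet uses to compute node allocatable (capacity minus kube-reserved, system-reserved, and the hard eviction threshold). A minimal sketch of that arithmetic for memory, with the values copied from the machine-info and nodeConfig records above (`KubeReserved: null`, `SystemReserved.memory: 1Gi`, `memory.available` hard threshold `100Mi`, capacity `33164492800` bytes); variable names are illustrative:

```python
# Node-allocatable memory arithmetic, mirroring the kubelet formula:
#   allocatable = capacity - kube_reserved - system_reserved - hard_eviction
GI = 1 << 30
MI = 1 << 20

capacity_mem = 33164492800   # bytes, from the cAdvisor machine-info record
kube_reserved = 0            # "KubeReserved":null in the nodeConfig above
system_reserved = 1 * GI     # "SystemReserved":{"memory":"1Gi",...}
hard_eviction = 100 * MI     # memory.available threshold "100Mi"

allocatable = capacity_mem - kube_reserved - system_reserved - hard_eviction
print(allocatable)  # 31985893376 bytes (~29.8 GiB)
```

The same subtraction is applied per resource; ephemeral-storage and CPU use their own reserved values from the same nodeConfig block.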
Apr 24 23:53:47.684096 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.683988 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-57hdc" Apr 24 23:53:47.684234 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.684220 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 23:53:47.685589 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.685574 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 24 23:53:47.690437 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.690416 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 23:53:47.690437 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.690426 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 23:53:47.691100 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691081 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 23:53:47.691241 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691223 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 23:53:47.691312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691295 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 23:53:47.691373 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691363 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 24 23:53:47.691408 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691375 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 24 23:53:47.691519 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.691504 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-57hdc" Apr 24 23:53:47.691620 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.691598 2572 kubelet_node_status.go:515] "Error getting the current node from 
lister" err="node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:47.692412 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.691148 2572 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-5.ec2.internal.18a97023f35101c8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-5.ec2.internal,UID:ip-10-0-138-5.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-5.ec2.internal,},FirstTimestamp:2026-04-24 23:53:47.68270996 +0000 UTC m=+0.457142835,LastTimestamp:2026-04-24 23:53:47.68270996 +0000 UTC m=+0.457142835,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-5.ec2.internal,}" Apr 24 23:53:47.692805 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.692788 2572 factory.go:55] Registering systemd factory Apr 24 23:53:47.692920 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.692888 2572 factory.go:223] Registration of the systemd container factory successfully Apr 24 23:53:47.693213 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.693196 2572 factory.go:153] Registering CRI-O factory Apr 24 23:53:47.693213 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.693215 2572 factory.go:223] Registration of the crio container factory successfully Apr 24 23:53:47.693324 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.693279 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 23:53:47.693324 ip-10-0-138-5 kubenswrapper[2572]: 
I0424 23:53:47.693303 2572 factory.go:103] Registering Raw factory Apr 24 23:53:47.693324 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.693317 2572 manager.go:1196] Started watching for new ooms in manager Apr 24 23:53:47.694148 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.694129 2572 manager.go:319] Starting recovery of all containers Apr 24 23:53:47.696373 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.696347 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 23:53:47.696696 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.696643 2572 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-5.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 23:53:47.697189 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.697168 2572 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 23:53:47.707874 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.707745 2572 manager.go:324] Recovery completed Apr 24 23:53:47.712232 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.712218 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.714825 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.714807 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.714924 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.714838 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.714924 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.714852 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.715447 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.715431 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 23:53:47.715447 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.715445 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 23:53:47.715546 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.715461 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 24 23:53:47.718773 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.718760 2572 policy_none.go:49] "None policy: Start" Apr 24 23:53:47.718809 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.718778 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 23:53:47.718809 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.718788 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.761690 2572 manager.go:341] 
"Starting Device Plugin manager" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.761727 2572 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.761736 2572 server.go:85] "Starting device plugin registration server" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.762001 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.762013 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.762123 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.762245 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.762253 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.762629 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 23:53:47.778662 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.762674 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:47.821431 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.821343 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 23:53:47.822612 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.822591 2572 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 24 23:53:47.822696 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.822624 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 23:53:47.822748 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.822698 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 23:53:47.822748 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.822710 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 23:53:47.822837 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.822752 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 23:53:47.825309 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.825289 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:47.862517 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.862466 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.865280 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.865254 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.865355 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.865288 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.865355 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.865305 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.865355 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.865332 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.871306 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.871287 2572 
kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.871355 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.871316 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-5.ec2.internal\": node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:47.886194 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.886172 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:47.922986 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.922955 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal"] Apr 24 23:53:47.923060 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.923034 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.924025 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.924010 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.924088 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.924041 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.924088 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.924054 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.926465 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.926451 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.926625 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.926611 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.926674 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.926638 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.927236 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927211 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.927337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927246 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.927337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927257 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.927337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927214 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.927337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927313 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.927337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.927325 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.929496 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.929476 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.929596 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.929508 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 23:53:47.930147 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.930133 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientMemory" Apr 24 23:53:47.930220 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.930163 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 23:53:47.930220 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.930176 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeHasSufficientPID" Apr 24 23:53:47.962825 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.962795 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-5.ec2.internal\" not found" node="ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.967592 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.967573 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-5.ec2.internal\" not found" node="ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.986426 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:47.986406 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:47.992711 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.992690 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e99259f9405d85ee079d839cce796346-config\") pod 
\"kube-apiserver-proxy-ip-10-0-138-5.ec2.internal\" (UID: \"e99259f9405d85ee079d839cce796346\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.992774 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.992715 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:47.992774 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:47.992734 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.086765 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.086684 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found" Apr 24 23:53:48.093023 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093000 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.093101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093034 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.093101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e99259f9405d85ee079d839cce796346-config\") pod \"kube-apiserver-proxy-ip-10-0-138-5.ec2.internal\" (UID: \"e99259f9405d85ee079d839cce796346\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.093101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093092 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e99259f9405d85ee079d839cce796346-config\") pod \"kube-apiserver-proxy-ip-10-0-138-5.ec2.internal\" (UID: \"e99259f9405d85ee079d839cce796346\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.093234 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093122 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" Apr 24 23:53:48.093234 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.093174 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e687fa430eec89daa3a0524c1f1c3729-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal\" (UID: \"e687fa430eec89daa3a0524c1f1c3729\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" 
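Every kubelet record above carries a klog header after the journald prefix: a severity letter (`I`/`W`/`E`/`F`), `MMDD`, a timestamp, the PID, and `file:line]`. When triaging a dump like this one, a small parser that tallies records per severity is handy; a minimal sketch (function name illustrative, demo lines copied from the log above):

```python
import re
from collections import Counter

# klog record header as it appears after the journald prefix:
#   <I|W|E|F><MMDD> <hh:mm:ss.micros> <pid> <file>:<line>]
KLOG_HEADER = re.compile(
    r'\b([IWEF])\d{4} \d{2}:\d{2}:\d{2}\.\d+\s+\d+\s+[\w.]+:\d+\]')

def tally_severities(lines):
    """Return a Counter of klog severities (I/W/E/F) found in journal lines."""
    counts = Counter()
    for line in lines:
        m = KLOG_HEADER.search(line)
        if m:
            counts[m.group(1)] += 1
    return counts

demo = [
    'Apr 24 23:53:47.707874 ip-10-0-138-5 kubenswrapper[2572]: '
    'I0424 23:53:47.707745 2572 manager.go:324] Recovery completed',
    'Apr 24 23:53:47.886194 ip-10-0-138-5 kubenswrapper[2572]: '
    'E0424 23:53:47.886172 2572 kubelet_node_status.go:515] '
    '"Error getting the current node from lister"',
]
print(tally_severities(demo))
```

Feeding it the full `journalctl -u kubelet.service` output makes the bootstrap-phase `E` records (the `system:anonymous` forbidden errors before the client CSR is issued) easy to count against steady-state noise.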
Apr 24 23:53:48.187420 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.187372 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.265934 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.265878 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal"
Apr 24 23:53:48.269483 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.269460 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal"
Apr 24 23:53:48.288135 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.288107 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.388793 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.388693 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.489187 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.489154 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.589807 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.589773 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.593069 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.593048 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 24 23:53:48.593238 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.593211 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 24 23:53:48.690614 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.690588 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.690614 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.690597 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 24 23:53:48.694077 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.694023 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 23:48:47 +0000 UTC" deadline="2027-12-10 15:02:30.963049171 +0000 UTC"
Apr 24 23:53:48.694077 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.694066 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14271h8m42.268987579s"
Apr 24 23:53:48.700692 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.700669 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 23:53:48.712866 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.712844 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:48.721942 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.721901 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-6tkv8"
Apr 24 23:53:48.729411 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.729392 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-6tkv8"
Apr 24 23:53:48.768241 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:48.768203 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode687fa430eec89daa3a0524c1f1c3729.slice/crio-8ecd844f026f34d650ba898a83224cf1050250a43e0c2bc4c037693cec8ee324 WatchSource:0}: Error finding container 8ecd844f026f34d650ba898a83224cf1050250a43e0c2bc4c037693cec8ee324: Status 404 returned error can't find the container with id 8ecd844f026f34d650ba898a83224cf1050250a43e0c2bc4c037693cec8ee324
Apr 24 23:53:48.768479 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:48.768457 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode99259f9405d85ee079d839cce796346.slice/crio-b3b2d1731da9afad54f8edd8e388cbeb7ed2283f787a131e06351249afea7b71 WatchSource:0}: Error finding container b3b2d1731da9afad54f8edd8e388cbeb7ed2283f787a131e06351249afea7b71: Status 404 returned error can't find the container with id b3b2d1731da9afad54f8edd8e388cbeb7ed2283f787a131e06351249afea7b71
Apr 24 23:53:48.772400 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.772387 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 23:53:48.790800 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.790771 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.826126 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.826069 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" event={"ID":"e99259f9405d85ee079d839cce796346","Type":"ContainerStarted","Data":"b3b2d1731da9afad54f8edd8e388cbeb7ed2283f787a131e06351249afea7b71"}
Apr 24 23:53:48.827006 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.826982 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" event={"ID":"e687fa430eec89daa3a0524c1f1c3729","Type":"ContainerStarted","Data":"8ecd844f026f34d650ba898a83224cf1050250a43e0c2bc4c037693cec8ee324"}
Apr 24 23:53:48.890230 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:48.890201 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:48.891651 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.891633 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:48.992286 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:48.992195 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:49.092716 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.092648 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:49.193518 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.193485 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-5.ec2.internal\" not found"
Apr 24 23:53:49.261735 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.261648 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 24 23:53:49.292003 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.291736 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal"
Apr 24 23:53:49.302971 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.302940 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:49.303959 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.303933 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal"
Apr 24 23:53:49.313323 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.313299 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 24 23:53:49.668305 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.668225 2572 apiserver.go:52] "Watching apiserver"
Apr 24 23:53:49.676382 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.676355 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 24 23:53:49.676751 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.676727 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-xgtzt","openshift-network-operator/iptables-alerter-87xm9","openshift-ovn-kubernetes/ovnkube-node-9rpfg","kube-system/konnectivity-agent-g9jvl","kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq","openshift-cluster-node-tuning-operator/tuned-tcsxn","openshift-dns/node-resolver-bcdmk","openshift-multus/multus-65fxs","openshift-network-diagnostics/network-check-target-5gffj","openshift-image-registry/node-ca-vs22k","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal","openshift-multus/multus-additional-cni-plugins-qs259"]
Apr 24 23:53:49.682447 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.682413 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.684854 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.684452 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 24 23:53:49.684854 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.684486 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-87xm9"
Apr 24 23:53:49.684854 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.684525 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-pcwkl\""
Apr 24 23:53:49.684854 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.684684 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.684854 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.684741 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 23:53:49.685226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.685191 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.685226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.685219 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.686833 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.686814 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.686949 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.686872 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gfckq\""
Apr 24 23:53:49.687060 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.687126 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687065 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.687182 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687043 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 23:53:49.687869 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 24 23:53:49.687983 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687898 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 24 23:53:49.687983 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687928 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.687983 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687942 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 24 23:53:49.688135 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687854 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-vpkzx\""
Apr 24 23:53:49.688135 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.687854 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 24 23:53:49.689029 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.689007 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:53:49.689189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.689172 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.691042 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691024 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-w78bb\""
Apr 24 23:53:49.691231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691218 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 24 23:53:49.691342 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691321 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 24 23:53:49.691451 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691372 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.691451 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691392 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wf2j9\""
Apr 24 23:53:49.691693 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691576 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 24 23:53:49.691898 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.691880 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 24 23:53:49.693226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.693175 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 24 23:53:49.693381 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.693365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.693381 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.693375 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.693499 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.693487 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-rx27f\""
Apr 24 23:53:49.693841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.693820 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:53:49.693973 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.693947 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:53:49.696235 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.696216 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vs22k"
Apr 24 23:53:49.698259 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.698239 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-rgfxr\""
Apr 24 23:53:49.698350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.698302 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.698350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.698309 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.698350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.698324 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 24 23:53:49.698658 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.698642 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tcsxn"
Apr 24 23:53:49.700872 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.700853 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8j9vs\""
Apr 24 23:53:49.700976 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.700884 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 24 23:53:49.701521 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.701501 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 24 23:53:49.702119 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702095 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-netns\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702252 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-bin\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702309 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702275 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-log-socket\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.702407 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702352 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-hostroot\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.702474 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702384 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9xpj\" (UniqueName: \"kubernetes.io/projected/0818dbcb-a498-4a49-8ca5-0b677796b068-kube-api-access-k9xpj\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.702540 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702502 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-kubelet\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702540 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-systemd-units\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702635 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-ovn\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.702635 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702605 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.702722 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702712 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-system-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.702767 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79016484-eae2-4542-8926-e0955b9dfe90-iptables-alerter-script\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9"
Apr 24 23:53:49.702818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702779 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cnibin\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.702863 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702821 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-registration-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.702863 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-cnibin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.702974 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.702885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-multus\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703097 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703078 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703149 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703124 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-script-lib\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703250 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703156 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzp6\" (UniqueName: \"kubernetes.io/projected/bbd086d5-cca3-4b01-aa4c-f76f49619285-kube-api-access-lrzp6\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703300 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703272 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-bin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703304 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-systemd\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703335 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-env-overrides\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-etc-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.703500 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703467 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tknv\" (UniqueName: \"kubernetes.io/projected/c23cdd8c-e99e-473b-acb6-6602cadc65a1-kube-api-access-5tknv\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.703549 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703496 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-daemon-config\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703598 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703577 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:53:49.703651 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703607 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-os-release\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.703651 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703632 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-cni-binary-copy\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703738 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703692 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-k8s-cni-cncf-io\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703792 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-multus-certs\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.703843 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703792 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5lrk\" (UniqueName: \"kubernetes.io/projected/79016484-eae2-4542-8926-e0955b9dfe90-kube-api-access-c5lrk\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9"
Apr 24 23:53:49.703843 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703815 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-agent-certs\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:53:49.703956 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-konnectivity-ca\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:53:49.703956 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703875 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6t6k\" (UniqueName: \"kubernetes.io/projected/6ce242e4-92d1-4ff1-8276-05d4293cfb10-kube-api-access-z6t6k\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:53:49.703956 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703921 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-socket-dir-parent\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.704101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-kubelet\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.704101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.703983 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-system-cni-dir\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.704101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704057 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-var-lib-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.704101 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.704279 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.704279 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704155 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259"
Apr 24 23:53:49.704279 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bhrr\" (UniqueName: \"kubernetes.io/projected/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kube-api-access-9bhrr\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.704279 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704240 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-config\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.704279 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.704490 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704321 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-device-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq"
Apr 24 23:53:49.704490 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704468 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-netns\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.704581 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704499 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:53:49.704581 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704504 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bcdmk"
Apr 24 23:53:49.704581 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs"
Apr 24 23:53:49.704581 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704573 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-netd\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:53:49.704744 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.704575 2572 pod_workers.go:1301] "Error syncing pod, skipping"
err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:49.704744 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704608 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-socket-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.704744 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704639 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79016484-eae2-4542-8926-e0955b9dfe90-host-slash\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.704744 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704673 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.704744 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704733 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qs259\" 
(UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.704973 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704770 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-conf-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.704973 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704819 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-etc-kubernetes\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.704973 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-node-log\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.704973 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-slash\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.705142 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.704991 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovn-node-metrics-cert\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.705142 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.705035 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-sys-fs\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.705142 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.705065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-os-release\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.706664 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.706645 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-qc7t9\"" Apr 24 23:53:49.706894 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.706876 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 23:53:49.706894 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.706886 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 23:53:49.730287 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.730260 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:48 +0000 UTC" deadline="2027-09-23 05:32:01.744228664 +0000 UTC" Apr 24 23:53:49.730287 ip-10-0-138-5 
kubenswrapper[2572]: I0424 23:53:49.730285 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12389h38m12.013945446s" Apr 24 23:53:49.792343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.792316 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 24 23:53:49.806243 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-script-lib\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806281 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j74lz\" (UniqueName: \"kubernetes.io/projected/9646a754-da93-4e1f-9571-2b775195390b-kube-api-access-j74lz\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.806357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806305 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-modprobe-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.806357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806357 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806327 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-var-lib-kubelet\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.806597 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806372 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzp6\" (UniqueName: \"kubernetes.io/projected/bbd086d5-cca3-4b01-aa4c-f76f49619285-kube-api-access-lrzp6\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806597 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806447 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-bin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.806597 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-systemd\") pod \"ovnkube-node-9rpfg\" (UID: 
\"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806597 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806543 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-bin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.806597 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806553 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-systemd\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806590 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-env-overrides\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806621 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-etc-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tknv\" (UniqueName: \"kubernetes.io/projected/c23cdd8c-e99e-473b-acb6-6602cadc65a1-kube-api-access-5tknv\") pod \"multus-additional-cni-plugins-qs259\" 
(UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806663 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-daemon-config\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806689 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806717 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-lib-modules\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.806761 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-os-release\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806767 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-cni-binary-copy\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-k8s-cni-cncf-io\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806811 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-multus-certs\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806859 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5lrk\" (UniqueName: \"kubernetes.io/projected/79016484-eae2-4542-8926-e0955b9dfe90-kube-api-access-c5lrk\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806885 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-agent-certs\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806923 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-konnectivity-ca\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806939 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-script-lib\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806951 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6t6k\" (UniqueName: \"kubernetes.io/projected/6ce242e4-92d1-4ff1-8276-05d4293cfb10-kube-api-access-z6t6k\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806955 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-os-release\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806977 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-socket-dir-parent\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.806689 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-etc-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807005 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-kubelet\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807031 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-system-cni-dir\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-k8s-cni-cncf-io\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807063 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-env-overrides\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807091 2572 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-var-lib-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807131 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-kubelet\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807151 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807190 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-socket-dir-parent\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807207 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-multus-certs\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807208 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807239 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-daemon-config\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-var-lib-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.807265 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807275 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-system-cni-dir\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807288 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807298 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-openvswitch\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.807446 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.807375 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.307338365 +0000 UTC m=+3.081771240 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bhrr\" (UniqueName: \"kubernetes.io/projected/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kube-api-access-9bhrr\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807584 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-config\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807597 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807631 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807655 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-device-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-netns\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807729 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/5bcba16a-2d33-4168-8eae-a6ab55719a08-hosts-file\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807756 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-netd\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807764 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-konnectivity-ca\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0818dbcb-a498-4a49-8ca5-0b677796b068-cni-binary-copy\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807784 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-socket-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807825 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79016484-eae2-4542-8926-e0955b9dfe90-host-slash\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807829 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-run-netns\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808161 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807854 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bcba16a-2d33-4168-8eae-a6ab55719a08-tmp-dir\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807876 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-kubernetes\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807887 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-netd\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807897 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-run\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807945 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/79016484-eae2-4542-8926-e0955b9dfe90-host-slash\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807957 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-etc-tuned\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807973 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.807988 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808039 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-conf-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-device-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808079 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-etc-kubernetes\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808091 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-socket-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808119 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf5bz\" (UniqueName: \"kubernetes.io/projected/5bcba16a-2d33-4168-8eae-a6ab55719a08-kube-api-access-nf5bz\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808130 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-multus-conf-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-host\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808171 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-tmp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.808841 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808178 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-etc-kubernetes\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808184 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808227 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-node-log\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:49.808278 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-slash\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808306 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovn-node-metrics-cert\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808314 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-node-log\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808333 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-sys-fs\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808357 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-os-release\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808361 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-slash\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808393 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-conf\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808419 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-sys\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808427 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-sys-fs\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808444 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-netns\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:49.808361 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovnkube-config\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808471 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-bin\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808484 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-binary-copy\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.809435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-run-netns\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808522 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-cni-bin\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:49.808550 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-systemd\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808574 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-log-socket\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808613 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-os-release\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808632 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-log-socket\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:49.808653 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-hostroot\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808684 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9xpj\" (UniqueName: \"kubernetes.io/projected/0818dbcb-a498-4a49-8ca5-0b677796b068-kube-api-access-k9xpj\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-hostroot\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysconfig\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808741 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-kubelet\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808766 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808793 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-systemd-units\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808800 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-host-kubelet\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808821 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-ovn\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808875 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-run-ovn\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808877 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.810021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808921 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bbd086d5-cca3-4b01-aa4c-f76f49619285-systemd-units\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-system-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808963 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79016484-eae2-4542-8926-e0955b9dfe90-iptables-alerter-script\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809011 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9646a754-da93-4e1f-9571-2b775195390b-host\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.808990 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-etc-selinux\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-system-cni-dir\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9646a754-da93-4e1f-9571-2b775195390b-serviceca\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809055 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809077 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cnibin\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.810623 ip-10-0-138-5 
kubenswrapper[2572]: I0424 23:53:49.809106 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-registration-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809131 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-cnibin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809157 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-multus\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809177 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c23cdd8c-e99e-473b-acb6-6602cadc65a1-cnibin\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809185 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgxp\" (UniqueName: \"kubernetes.io/projected/9369f8ba-c07b-4dda-864a-5be415a51468-kube-api-access-tsgxp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 
24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809233 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-cnibin\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0818dbcb-a498-4a49-8ca5-0b677796b068-host-var-lib-cni-multus\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809276 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ab1eac84-5f92-48ae-833a-00e1a821cd2e-registration-dir\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.810623 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.809471 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/79016484-eae2-4542-8926-e0955b9dfe90-iptables-alerter-script\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.812198 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.812173 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bbd086d5-cca3-4b01-aa4c-f76f49619285-ovn-node-metrics-cert\") pod \"ovnkube-node-9rpfg\" (UID: \"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 
23:53:49.817118 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.817098 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/926c45d5-951c-4d9f-9ec3-07d7ca1a80dc-agent-certs\") pod \"konnectivity-agent-g9jvl\" (UID: \"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc\") " pod="kube-system/konnectivity-agent-g9jvl" Apr 24 23:53:49.821030 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.821003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6t6k\" (UniqueName: \"kubernetes.io/projected/6ce242e4-92d1-4ff1-8276-05d4293cfb10-kube-api-access-z6t6k\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:49.821442 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.821418 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5lrk\" (UniqueName: \"kubernetes.io/projected/79016484-eae2-4542-8926-e0955b9dfe90-kube-api-access-c5lrk\") pod \"iptables-alerter-87xm9\" (UID: \"79016484-eae2-4542-8926-e0955b9dfe90\") " pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:49.821585 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.821563 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bhrr\" (UniqueName: \"kubernetes.io/projected/ab1eac84-5f92-48ae-833a-00e1a821cd2e-kube-api-access-9bhrr\") pod \"aws-ebs-csi-driver-node-g4pzq\" (UID: \"ab1eac84-5f92-48ae-833a-00e1a821cd2e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:49.821773 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.821749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzp6\" (UniqueName: \"kubernetes.io/projected/bbd086d5-cca3-4b01-aa4c-f76f49619285-kube-api-access-lrzp6\") pod \"ovnkube-node-9rpfg\" (UID: 
\"bbd086d5-cca3-4b01-aa4c-f76f49619285\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:49.822735 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.822714 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9xpj\" (UniqueName: \"kubernetes.io/projected/0818dbcb-a498-4a49-8ca5-0b677796b068-kube-api-access-k9xpj\") pod \"multus-65fxs\" (UID: \"0818dbcb-a498-4a49-8ca5-0b677796b068\") " pod="openshift-multus/multus-65fxs" Apr 24 23:53:49.822826 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.822758 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tknv\" (UniqueName: \"kubernetes.io/projected/c23cdd8c-e99e-473b-acb6-6602cadc65a1-kube-api-access-5tknv\") pod \"multus-additional-cni-plugins-qs259\" (UID: \"c23cdd8c-e99e-473b-acb6-6602cadc65a1\") " pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:49.909755 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-lib-modules\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.909755 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909764 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5bcba16a-2d33-4168-8eae-a6ab55719a08-hosts-file\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909781 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bcba16a-2d33-4168-8eae-a6ab55719a08-tmp-dir\") pod \"node-resolver-bcdmk\" (UID: 
\"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909828 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5bcba16a-2d33-4168-8eae-a6ab55719a08-hosts-file\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909901 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-kubernetes\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909938 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-lib-modules\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909953 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-run\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.909992 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-etc-tuned\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " 
pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910003 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-run\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910016 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910004 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-kubernetes\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910021 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nf5bz\" (UniqueName: \"kubernetes.io/projected/5bcba16a-2d33-4168-8eae-a6ab55719a08-kube-api-access-nf5bz\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-host\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910068 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-tmp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 
23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910093 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910105 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5bcba16a-2d33-4168-8eae-a6ab55719a08-tmp-dir\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910123 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-conf\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910149 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-sys\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910178 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-systemd\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 
ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910206 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysconfig\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910249 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-conf\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9646a754-da93-4e1f-9571-2b775195390b-host\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910285 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9646a754-da93-4e1f-9571-2b775195390b-serviceca\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910309 
2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-systemd\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910312 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tsgxp\" (UniqueName: \"kubernetes.io/projected/9369f8ba-c07b-4dda-864a-5be415a51468-kube-api-access-tsgxp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysconfig\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910124 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-host\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.910388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910362 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9646a754-da93-4e1f-9571-2b775195390b-host\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910390 2572 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j74lz\" (UniqueName: \"kubernetes.io/projected/9646a754-da93-4e1f-9571-2b775195390b-kube-api-access-j74lz\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910435 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-modprobe-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910459 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-var-lib-kubelet\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910538 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-var-lib-kubelet\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910645 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-modprobe-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910663 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-etc-sysctl-d\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910692 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9369f8ba-c07b-4dda-864a-5be415a51468-sys\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.911079 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.910873 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9646a754-da93-4e1f-9571-2b775195390b-serviceca\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.912339 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.912315 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-tmp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.912428 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.912375 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9369f8ba-c07b-4dda-864a-5be415a51468-etc-tuned\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.917197 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.917177 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 23:53:49.919817 ip-10-0-138-5 kubenswrapper[2572]: E0424 
23:53:49.919768 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:49.919817 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.919789 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:49.919817 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.919802 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:49.920039 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:49.919918 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:50.419855297 +0000 UTC m=+3.194288153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:49.922228 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.922206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j74lz\" (UniqueName: \"kubernetes.io/projected/9646a754-da93-4e1f-9571-2b775195390b-kube-api-access-j74lz\") pod \"node-ca-vs22k\" (UID: \"9646a754-da93-4e1f-9571-2b775195390b\") " pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:49.922355 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.922333 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgxp\" (UniqueName: \"kubernetes.io/projected/9369f8ba-c07b-4dda-864a-5be415a51468-kube-api-access-tsgxp\") pod \"tuned-tcsxn\" (UID: \"9369f8ba-c07b-4dda-864a-5be415a51468\") " pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:49.922423 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.922400 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf5bz\" (UniqueName: \"kubernetes.io/projected/5bcba16a-2d33-4168-8eae-a6ab55719a08-kube-api-access-nf5bz\") pod \"node-resolver-bcdmk\" (UID: \"5bcba16a-2d33-4168-8eae-a6ab55719a08\") " pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:49.994452 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:49.994419 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-65fxs" Apr 24 23:53:50.002181 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.002150 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-87xm9" Apr 24 23:53:50.011747 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.011728 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" Apr 24 23:53:50.016252 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.016234 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qs259" Apr 24 23:53:50.022863 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.022836 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-g9jvl" Apr 24 23:53:50.029471 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.029450 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" Apr 24 23:53:50.034986 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.034969 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-vs22k" Apr 24 23:53:50.042530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.042506 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" Apr 24 23:53:50.049044 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.049019 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bcdmk" Apr 24 23:53:50.313176 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.313112 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:50.313308 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.313251 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:50.313346 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.313311 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:51.313296236 +0000 UTC m=+4.087729092 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:50.356889 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.356857 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc23cdd8c_e99e_473b_acb6_6602cadc65a1.slice/crio-44cac0cf02774a218c666f680521bf96f2155e8fd760c9151ae71c01730cf349 WatchSource:0}: Error finding container 44cac0cf02774a218c666f680521bf96f2155e8fd760c9151ae71c01730cf349: Status 404 returned error can't find the container with id 44cac0cf02774a218c666f680521bf96f2155e8fd760c9151ae71c01730cf349 Apr 24 23:53:50.358507 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.358476 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbd086d5_cca3_4b01_aa4c_f76f49619285.slice/crio-4078b86b83b6a68e6cc34c3054492536256b03fde9a6bb96a806e87f373e6ea2 WatchSource:0}: Error finding container 4078b86b83b6a68e6cc34c3054492536256b03fde9a6bb96a806e87f373e6ea2: Status 404 returned error can't find the container with id 4078b86b83b6a68e6cc34c3054492536256b03fde9a6bb96a806e87f373e6ea2 Apr 24 23:53:50.359514 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.359491 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79016484_eae2_4542_8926_e0955b9dfe90.slice/crio-dccc77a1b02175c2d9238a8dc052ba70947f7c866b3f9f4ff4aee1ccc2b63f1a WatchSource:0}: Error finding container dccc77a1b02175c2d9238a8dc052ba70947f7c866b3f9f4ff4aee1ccc2b63f1a: Status 404 returned error can't find the container with id dccc77a1b02175c2d9238a8dc052ba70947f7c866b3f9f4ff4aee1ccc2b63f1a Apr 24 23:53:50.360453 ip-10-0-138-5 
kubenswrapper[2572]: W0424 23:53:50.360421 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9646a754_da93_4e1f_9571_2b775195390b.slice/crio-55962e1123ebc5f01cb477de014a50f184aa7b03dd6404abd49a1be865b3c8d8 WatchSource:0}: Error finding container 55962e1123ebc5f01cb477de014a50f184aa7b03dd6404abd49a1be865b3c8d8: Status 404 returned error can't find the container with id 55962e1123ebc5f01cb477de014a50f184aa7b03dd6404abd49a1be865b3c8d8 Apr 24 23:53:50.362972 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.362900 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab1eac84_5f92_48ae_833a_00e1a821cd2e.slice/crio-7f71c86422a24e95e45ff087b0bb84e9b8a991cd022720cbe635be78bf0c7a89 WatchSource:0}: Error finding container 7f71c86422a24e95e45ff087b0bb84e9b8a991cd022720cbe635be78bf0c7a89: Status 404 returned error can't find the container with id 7f71c86422a24e95e45ff087b0bb84e9b8a991cd022720cbe635be78bf0c7a89 Apr 24 23:53:50.366420 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.366398 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod926c45d5_951c_4d9f_9ec3_07d7ca1a80dc.slice/crio-11075124db6b97cf7fdbfceb07349e0f96e7cc6e79ede470b6c2231ccc3d66d8 WatchSource:0}: Error finding container 11075124db6b97cf7fdbfceb07349e0f96e7cc6e79ede470b6c2231ccc3d66d8: Status 404 returned error can't find the container with id 11075124db6b97cf7fdbfceb07349e0f96e7cc6e79ede470b6c2231ccc3d66d8 Apr 24 23:53:50.367644 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:53:50.367622 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9369f8ba_c07b_4dda_864a_5be415a51468.slice/crio-443dd832ecdbc6ff8603f49f60afdc8e830905a5b8e24b12fe648addc03379bf WatchSource:0}: Error finding container 
443dd832ecdbc6ff8603f49f60afdc8e830905a5b8e24b12fe648addc03379bf: Status 404 returned error can't find the container with id 443dd832ecdbc6ff8603f49f60afdc8e830905a5b8e24b12fe648addc03379bf Apr 24 23:53:50.514536 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.514304 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:50.514713 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.514471 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:50.514713 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.514605 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:50.514713 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.514619 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:50.514713 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:50.514679 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:51.514660267 +0000 UTC m=+4.289093128 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:50.731055 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.730890 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 23:48:48 +0000 UTC" deadline="2028-01-02 17:45:07.752337098 +0000 UTC" Apr 24 23:53:50.731055 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.730945 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14825h51m17.021396903s" Apr 24 23:53:50.837326 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.837259 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vs22k" event={"ID":"9646a754-da93-4e1f-9571-2b775195390b","Type":"ContainerStarted","Data":"55962e1123ebc5f01cb477de014a50f184aa7b03dd6404abd49a1be865b3c8d8"} Apr 24 23:53:50.839054 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.838984 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"4078b86b83b6a68e6cc34c3054492536256b03fde9a6bb96a806e87f373e6ea2"} Apr 24 23:53:50.849478 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.849441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerStarted","Data":"44cac0cf02774a218c666f680521bf96f2155e8fd760c9151ae71c01730cf349"} Apr 24 23:53:50.852242 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:53:50.852210 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" event={"ID":"e99259f9405d85ee079d839cce796346","Type":"ContainerStarted","Data":"6ae8f17d4c37ca0dad35ced12fc90f85c3fd24ec78dabff6862bd184a687b473"} Apr 24 23:53:50.857721 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.857691 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-65fxs" event={"ID":"0818dbcb-a498-4a49-8ca5-0b677796b068","Type":"ContainerStarted","Data":"9981a3b6537749da249be5aa9ef0501dff4d92e9ba5783edc3832aa605c0be31"} Apr 24 23:53:50.867464 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.867406 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-5.ec2.internal" podStartSLOduration=1.867386299 podStartE2EDuration="1.867386299s" podCreationTimestamp="2026-04-24 23:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:50.865454883 +0000 UTC m=+3.639887763" watchObservedRunningTime="2026-04-24 23:53:50.867386299 +0000 UTC m=+3.641819179" Apr 24 23:53:50.869433 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.869401 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" event={"ID":"ab1eac84-5f92-48ae-833a-00e1a821cd2e","Type":"ContainerStarted","Data":"7f71c86422a24e95e45ff087b0bb84e9b8a991cd022720cbe635be78bf0c7a89"} Apr 24 23:53:50.879118 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.879082 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" event={"ID":"9369f8ba-c07b-4dda-864a-5be415a51468","Type":"ContainerStarted","Data":"443dd832ecdbc6ff8603f49f60afdc8e830905a5b8e24b12fe648addc03379bf"} Apr 24 23:53:50.889946 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.889679 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9jvl" event={"ID":"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc","Type":"ContainerStarted","Data":"11075124db6b97cf7fdbfceb07349e0f96e7cc6e79ede470b6c2231ccc3d66d8"} Apr 24 23:53:50.893315 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.893278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcdmk" event={"ID":"5bcba16a-2d33-4168-8eae-a6ab55719a08","Type":"ContainerStarted","Data":"d01363ad1722316838207d8be8b8d2e11190be93bcff99f88f43363baba2ce7d"} Apr 24 23:53:50.895328 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:50.895300 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-87xm9" event={"ID":"79016484-eae2-4542-8926-e0955b9dfe90","Type":"ContainerStarted","Data":"dccc77a1b02175c2d9238a8dc052ba70947f7c866b3f9f4ff4aee1ccc2b63f1a"} Apr 24 23:53:51.318395 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.318352 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:51.318636 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.318551 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:51.318636 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.318620 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:53.318599304 +0000 UTC m=+6.093032173 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:51.520514 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.520411 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:51.520694 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.520587 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:51.520694 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.520606 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:51.520694 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.520618 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:51.520851 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.520681 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:53:53.520660605 +0000 UTC m=+6.295093479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:51.823697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.823128 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:51.823697 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.823273 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:53:51.823697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.823351 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:51.823697 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:51.823452 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:51.923979 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.922331 2572 generic.go:358] "Generic (PLEG): container finished" podID="e687fa430eec89daa3a0524c1f1c3729" containerID="0b7a6274f130d8bdd2711c478c5b85df7cafc7ae4f90e4b419267ff2761f13b9" exitCode=0 Apr 24 23:53:51.923979 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:51.922967 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" event={"ID":"e687fa430eec89daa3a0524c1f1c3729","Type":"ContainerDied","Data":"0b7a6274f130d8bdd2711c478c5b85df7cafc7ae4f90e4b419267ff2761f13b9"} Apr 24 23:53:52.929059 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:52.929024 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" event={"ID":"e687fa430eec89daa3a0524c1f1c3729","Type":"ContainerStarted","Data":"0c4fdceb90ebd7b95abf2425d18dec69fd79dac91c56f6c069544ee2d4037f0d"} Apr 24 23:53:53.333215 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:53.333174 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:53.333451 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.333402 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:53.333519 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.333466 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs 
podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:57.333447578 +0000 UTC m=+10.107880440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:53.535318 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:53.534582 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:53.535318 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.534787 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:53.535318 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.534805 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:53.535318 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.534817 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:53.535318 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.534873 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. No retries permitted until 2026-04-24 23:53:57.534853787 +0000 UTC m=+10.309286646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:53.823845 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:53.823810 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:53.824013 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.823978 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:53:53.824405 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:53.824386 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:53.824547 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:53.824527 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:55.823941 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:55.823862 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:55.823941 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:55.823886 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:55.824473 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:55.824019 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:53:55.824539 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:55.824517 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:57.369103 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:57.368491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:57.369103 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.368676 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:57.369103 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.368741 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:05.368722537 +0000 UTC m=+18.143155396 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:53:57.570149 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:57.570113 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:57.570334 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.570310 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:53:57.570387 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.570336 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:53:57.570387 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.570349 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:57.570472 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.570415 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:05.570395634 +0000 UTC m=+18.344828497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:53:57.825023 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:57.824409 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:57.825023 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.824531 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:53:57.825023 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:57.824586 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:57.825023 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:57.824651 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:59.776834 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.776775 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-5.ec2.internal" podStartSLOduration=10.776738447 podStartE2EDuration="10.776738447s" podCreationTimestamp="2026-04-24 23:53:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:53:52.946078195 +0000 UTC m=+5.720511075" watchObservedRunningTime="2026-04-24 23:53:59.776738447 +0000 UTC m=+12.551171306" Apr 24 23:53:59.777696 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.777674 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xdtfx"] Apr 24 23:53:59.795216 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.795186 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.795374 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:59.795270 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e" Apr 24 23:53:59.823424 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.823382 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:53:59.823567 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.823388 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:53:59.823567 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:59.823495 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:53:59.823685 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:59.823595 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:53:59.890152 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.890104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.890300 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.890176 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-dbus\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.890300 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.890214 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-kubelet-config\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.990837 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.990797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.991035 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.990881 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-dbus\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.991035 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.990934 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-kubelet-config\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.991035 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:59.990961 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:59.991035 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:53:59.991034 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:00.491014578 +0000 UTC m=+13.265447436 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:53:59.991225 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.991039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-kubelet-config\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:53:59.991225 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:53:59.991087 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f82ef8db-31f3-4480-973b-af1f4b6e810e-dbus\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:00.495211 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:00.495176 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:00.495369 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:00.495318 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not 
registered Apr 24 23:54:00.495421 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:00.495377 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:01.495363519 +0000 UTC m=+14.269796376 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.501867 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:01.501834 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:01.502317 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:01.501995 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.502317 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:01.502056 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:03.502039073 +0000 UTC m=+16.276471945 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:01.823866 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:01.823782 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:01.823866 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:01.823841 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:01.824228 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:01.824201 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:01.824339 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:01.824210 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e" Apr 24 23:54:01.826721 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:01.824602 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:54:01.826721 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:01.824732 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:54:03.516756 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:03.516718 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:03.517204 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:03.516839 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 23:54:03.517204 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:03.516928 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:07.516887617 +0000 UTC m=+20.291320683 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:03.823005 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:03.822920 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:03.823005 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:03.822968 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:03.823210 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:03.823059 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:03.823210 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:03.823106 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:03.823210 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:03.823161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:03.823357 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:03.823252 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:05.428671 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:05.428617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:05.429235 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.428803 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:05.429235 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.428878 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.42885814 +0000 UTC m=+34.203291015 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 23:54:05.630280 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:05.630245 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:05.630450 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.630428 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 23:54:05.630529 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.630455 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 23:54:05.630529 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.630467 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:05.630619 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.630529 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.63051018 +0000 UTC m=+34.404943038 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 23:54:05.823305 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:05.823222 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:05.823305 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:05.823267 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:05.823542 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.823375 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:05.823542 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.823417 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:05.823542 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:05.823509 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:05.823736 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:05.823675 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:07.542118 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:07.542084 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:07.542392 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:07.542191 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:07.542392 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:07.542257 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:15.542241311 +0000 UTC m=+28.316674167 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:07.824214 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:07.824131 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:07.824444 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:07.824223 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:07.824444 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:07.824256 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:07.824444 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:07.824256 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:07.824444 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:07.824326 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:07.824444 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:07.824420 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:08.956273 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.955845 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-65fxs" event={"ID":"0818dbcb-a498-4a49-8ca5-0b677796b068","Type":"ContainerStarted","Data":"d06d512ce3cac7f4c8f886d51ff8a22d2c45285d626fd14a2bc43a429ae80f97"}
Apr 24 23:54:08.957316 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.957281 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" event={"ID":"ab1eac84-5f92-48ae-833a-00e1a821cd2e","Type":"ContainerStarted","Data":"ba40a37a5f3a398ca6d04ffef7cfe72edb4cf13eac2f115d372a765a00ff3460"}
Apr 24 23:54:08.958604 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.958580 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" event={"ID":"9369f8ba-c07b-4dda-864a-5be415a51468","Type":"ContainerStarted","Data":"2849a3c32596b17a643db0ec0a2682f4abd430ee6fc84df7af9af2b92af20bc0"}
Apr 24 23:54:08.959945 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.959899 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-g9jvl" event={"ID":"926c45d5-951c-4d9f-9ec3-07d7ca1a80dc","Type":"ContainerStarted","Data":"c32c190f7eec19c79cac9bb42220736dba0e5616b004458458acd49c54b4897c"}
Apr 24 23:54:08.961163 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.961144 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bcdmk" event={"ID":"5bcba16a-2d33-4168-8eae-a6ab55719a08","Type":"ContainerStarted","Data":"cd962d4656a1f92f31587fca5dfe976485d75448740a32b3cbd8fbe45b44ed4c"}
Apr 24 23:54:08.962370 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.962347 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-vs22k" event={"ID":"9646a754-da93-4e1f-9571-2b775195390b","Type":"ContainerStarted","Data":"c12c7b3c16a731bbd9d7793a7fac8ac8eef22a662adf27e572ca8f62c8af0fd2"}
Apr 24 23:54:08.964745 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"e75e5a48d498b1330c75275d5852fc3c2ae4e1bc10b1c9a3c96024100492bef2"}
Apr 24 23:54:08.964818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964752 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"385552ae0d45b3a8884b5b36a9f3a6c74d2df5ccd3cd3b8225a2583ab7f6eacf"}
Apr 24 23:54:08.964818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964762 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"90ca6c0724becb3cfe4a8beab3fd98821634ce284c03b6410d9d4fd703cc7b46"}
Apr 24 23:54:08.964818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964771 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"ae7243e9fbd3e1a1ef6532c0eaa6d17404bb68cbdec27c40c362e7f1f5d34b8c"}
Apr 24 23:54:08.964818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964779 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"aa2c99ac12087d20c6b5ba20b38f282818cb056315c24922f6977f30f4d605d5"}
Apr 24 23:54:08.964818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.964787 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"11844115aa9c2c23484966812ade45bb2cb6bbc825c95ae430a6835119fc3f3c"}
Apr 24 23:54:08.965965 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.965944 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="1193487ec2dcc0dd52be8e99ee15a9173dc13fff8b0beac19298c33d6e57f228" exitCode=0
Apr 24 23:54:08.966043 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.965975 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"1193487ec2dcc0dd52be8e99ee15a9173dc13fff8b0beac19298c33d6e57f228"}
Apr 24 23:54:08.973838 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.973790 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-65fxs" podStartSLOduration=4.369455679 podStartE2EDuration="21.973779562s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.372097411 +0000 UTC m=+3.146530280" lastFinishedPulling="2026-04-24 23:54:07.976421307 +0000 UTC m=+20.750854163" observedRunningTime="2026-04-24 23:54:08.973757645 +0000 UTC m=+21.748190534" watchObservedRunningTime="2026-04-24 23:54:08.973779562 +0000 UTC m=+21.748212469"
Apr 24 23:54:08.989541 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:08.989499 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-g9jvl" podStartSLOduration=4.842232189 podStartE2EDuration="21.989484816s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.370026249 +0000 UTC m=+3.144459118" lastFinishedPulling="2026-04-24 23:54:07.517278888 +0000 UTC m=+20.291711745" observedRunningTime="2026-04-24 23:54:08.989337144 +0000 UTC m=+21.763770023" watchObservedRunningTime="2026-04-24 23:54:08.989484816 +0000 UTC m=+21.763917693"
Apr 24 23:54:09.009889 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.009842 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tcsxn" podStartSLOduration=3.409033335 podStartE2EDuration="21.009828125s" podCreationTimestamp="2026-04-24 23:53:48 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.370546707 +0000 UTC m=+3.144979577" lastFinishedPulling="2026-04-24 23:54:07.9713415 +0000 UTC m=+20.745774367" observedRunningTime="2026-04-24 23:54:09.009585552 +0000 UTC m=+21.784018454" watchObservedRunningTime="2026-04-24 23:54:09.009828125 +0000 UTC m=+21.784261002"
Apr 24 23:54:09.056982 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.056925 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-vs22k" podStartSLOduration=4.500392174 podStartE2EDuration="22.056891921s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.362726515 +0000 UTC m=+3.137159378" lastFinishedPulling="2026-04-24 23:54:07.919226251 +0000 UTC m=+20.693659125" observedRunningTime="2026-04-24 23:54:09.056736365 +0000 UTC m=+21.831169244" watchObservedRunningTime="2026-04-24 23:54:09.056891921 +0000 UTC m=+21.831324800"
Apr 24 23:54:09.080057 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.080013 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bcdmk" podStartSLOduration=3.479004832 podStartE2EDuration="21.079998153s" podCreationTimestamp="2026-04-24 23:53:48 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.37035537 +0000 UTC m=+3.144788230" lastFinishedPulling="2026-04-24 23:54:07.971348679 +0000 UTC m=+20.745781551" observedRunningTime="2026-04-24 23:54:09.079712373 +0000 UTC m=+21.854145250" watchObservedRunningTime="2026-04-24 23:54:09.079998153 +0000 UTC m=+21.854431030"
Apr 24 23:54:09.109836 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.109810 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:54:09.110794 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.110775 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:54:09.113530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.113499 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 24 23:54:09.772563 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.772444 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T23:54:09.113517326Z","UUID":"60f6e997-2df1-404e-a30d-d256f7f5429f","Handler":null,"Name":"","Endpoint":""}
Apr 24 23:54:09.775508 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.775479 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 24 23:54:09.775673 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.775517 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 24 23:54:09.822992 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.822957 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:09.823170 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.822957 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:09.823170 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:09.823083 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:09.823170 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:09.823157 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:09.823337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.823199 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:09.823337 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:09.823264 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:09.970641 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.970593 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" event={"ID":"ab1eac84-5f92-48ae-833a-00e1a821cd2e","Type":"ContainerStarted","Data":"e190f8b09d591b61870059727f4dd2db34792e73afb4cee27c70e2dec18c73cb"}
Apr 24 23:54:09.974072 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.974038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-87xm9" event={"ID":"79016484-eae2-4542-8926-e0955b9dfe90","Type":"ContainerStarted","Data":"0ff3f404b2e6fab41445077335d828ac2df2905b403b25caa9d7b0c3f3eafd86"}
Apr 24 23:54:09.974875 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.974850 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:54:09.975412 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.975392 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-g9jvl"
Apr 24 23:54:09.991113 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:09.990650 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-87xm9" podStartSLOduration=5.377266508 podStartE2EDuration="22.990634471s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.362053739 +0000 UTC m=+3.136486595" lastFinishedPulling="2026-04-24 23:54:07.975421685 +0000 UTC m=+20.749854558" observedRunningTime="2026-04-24 23:54:09.990170418 +0000 UTC m=+22.764603325" watchObservedRunningTime="2026-04-24 23:54:09.990634471 +0000 UTC m=+22.765067349"
Apr 24 23:54:10.976992 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:10.976952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" event={"ID":"ab1eac84-5f92-48ae-833a-00e1a821cd2e","Type":"ContainerStarted","Data":"9b3f44264897f33a5080a8b8b1c3c3b9fa5fa18e0c86d2cf93fba9cd0b96aef0"}
Apr 24 23:54:10.980531 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:10.980497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"2380c8270e9698f1116745faff64b75c2b88be5903a4fdbc42b8941afa9e023d"}
Apr 24 23:54:10.997159 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:10.997106 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g4pzq" podStartSLOduration=4.453266208 podStartE2EDuration="23.997088728s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.366691578 +0000 UTC m=+3.141124440" lastFinishedPulling="2026-04-24 23:54:09.91051409 +0000 UTC m=+22.684946960" observedRunningTime="2026-04-24 23:54:10.996544911 +0000 UTC m=+23.770977789" watchObservedRunningTime="2026-04-24 23:54:10.997088728 +0000 UTC m=+23.771521608"
Apr 24 23:54:11.826715 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:11.826680 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:11.826919 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:11.826725 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:11.826919 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:11.826688 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:11.826919 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:11.826809 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:11.827076 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:11.826921 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:11.827076 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:11.826983 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:13.823765 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.823434 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:13.824693 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.823488 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:13.824693 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:13.823837 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:13.824693 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:13.823951 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:13.824693 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.823507 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:13.824693 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:13.824034 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:13.990082 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.990040 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" event={"ID":"bbd086d5-cca3-4b01-aa4c-f76f49619285","Type":"ContainerStarted","Data":"f6c51a3eb1dcd87256b73f020a4b62b4b276778f0a766e41a7af6ee9d4967a5b"}
Apr 24 23:54:13.990412 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.990395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:13.990501 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.990421 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:13.991742 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.991715 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="85dbfbc01405f06d12483fb02bb699f3ec6b6b6731ffbbd891bf13bd67d014e0" exitCode=0
Apr 24 23:54:13.991861 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:13.991755 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"85dbfbc01405f06d12483fb02bb699f3ec6b6b6731ffbbd891bf13bd67d014e0"}
Apr 24 23:54:14.005576 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.005553 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:14.019443 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.019403 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg" podStartSLOduration=9.105214521 podStartE2EDuration="27.019391491s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.361015602 +0000 UTC m=+3.135448464" lastFinishedPulling="2026-04-24 23:54:08.275192575 +0000 UTC m=+21.049625434" observedRunningTime="2026-04-24 23:54:14.019004092 +0000 UTC m=+26.793436969" watchObservedRunningTime="2026-04-24 23:54:14.019391491 +0000 UTC m=+26.793824392"
Apr 24 23:54:14.923094 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.923011 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xgtzt"]
Apr 24 23:54:14.923423 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.923142 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:14.923423 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:14.923269 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:54:14.926758 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.926732 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xdtfx"]
Apr 24 23:54:14.926881 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.926821 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:14.926953 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:14.926936 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e"
Apr 24 23:54:14.927462 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.927435 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5gffj"]
Apr 24 23:54:14.927600 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.927521 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:14.927657 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:14.927602 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82"
Apr 24 23:54:14.995242 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.995204 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="98fb0ead741ba092e5ce9b2c3b109060f2dac62373332a4356a4064cd085d35f" exitCode=0
Apr 24 23:54:14.995388 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.995284 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"98fb0ead741ba092e5ce9b2c3b109060f2dac62373332a4356a4064cd085d35f"}
Apr 24 23:54:14.996476 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:14.995873 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:15.011858 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:15.011831 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:15.598839 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:15.598802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:15.599026 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:15.598930 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:15.599092 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:15.599032 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret podName:f82ef8db-31f3-4480-973b-af1f4b6e810e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:31.599014245 +0000 UTC m=+44.373447106 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret") pod "global-pull-secret-syncer-xdtfx" (UID: "f82ef8db-31f3-4480-973b-af1f4b6e810e") : object "kube-system"/"original-pull-secret" not registered
Apr 24 23:54:15.998719 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:15.998685 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="1a952f0cb119d9180ceb8a0b7ffabf04d77724bb6a789bc09a143b7f80fc9489" exitCode=0
Apr 24 23:54:15.999108 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:15.998797 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"1a952f0cb119d9180ceb8a0b7ffabf04d77724bb6a789bc09a143b7f80fc9489"}
Apr 24 23:54:16.823955 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:16.823925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:16.824141 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:16.823925 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:16.824141 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:16.824037 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:54:16.824141 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:16.824129 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e" Apr 24 23:54:16.824265 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:16.823932 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:16.824265 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:16.824245 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:54:18.823508 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:18.823299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:18.823947 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:18.823299 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:18.823947 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:18.823300 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:18.823947 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:18.823649 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:54:18.823947 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:18.823735 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e" Apr 24 23:54:18.823947 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:18.823798 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:54:20.823996 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:20.823901 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:20.824553 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:20.823901 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:20.824553 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:20.824054 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-xdtfx" podUID="f82ef8db-31f3-4480-973b-af1f4b6e810e" Apr 24 23:54:20.824553 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:20.824150 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10" Apr 24 23:54:20.824553 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:20.823901 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:20.824553 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:20.824264 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5gffj" podUID="388a768c-7e44-4c31-8196-916e3ba70a82" Apr 24 23:54:21.089337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.089250 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-5.ec2.internal" event="NodeReady" Apr 24 23:54:21.089555 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.089407 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 24 23:54:21.137887 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.137853 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wrrx5"] Apr 24 23:54:21.167748 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.167710 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bm675"] Apr 24 23:54:21.167930 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.167826 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.170730 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.170593 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 23:54:21.170730 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.170610 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 23:54:21.171378 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.171353 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\"" Apr 24 23:54:21.189259 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.189226 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bm675"] Apr 24 23:54:21.189259 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.189258 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wrrx5"] Apr 24 
23:54:21.189463 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.189365 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.191938 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.191898 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\"" Apr 24 23:54:21.192106 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.192085 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 23:54:21.192317 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.192290 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 23:54:21.192317 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.192307 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 23:54:21.240359 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.240320 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32f0c013-e25d-4e15-bbfa-6824bd7f131e-config-volume\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.240545 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.240388 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32f0c013-e25d-4e15-bbfa-6824bd7f131e-tmp-dir\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.240545 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.240420 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjx86\" (UniqueName: \"kubernetes.io/projected/32f0c013-e25d-4e15-bbfa-6824bd7f131e-kube-api-access-zjx86\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.240545 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.240530 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341387 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341303 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32f0c013-e25d-4e15-bbfa-6824bd7f131e-config-volume\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341387 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341377 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32f0c013-e25d-4e15-bbfa-6824bd7f131e-tmp-dir\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341640 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjx86\" (UniqueName: \"kubernetes.io/projected/32f0c013-e25d-4e15-bbfa-6824bd7f131e-kube-api-access-zjx86\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341640 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341433 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.341640 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341471 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cwcc\" (UniqueName: \"kubernetes.io/projected/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-kube-api-access-7cwcc\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.341640 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341526 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341640 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.341617 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:21.341868 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.341679 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.841658303 +0000 UTC m=+34.616091179 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found Apr 24 23:54:21.341868 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341824 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/32f0c013-e25d-4e15-bbfa-6824bd7f131e-tmp-dir\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.341992 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.341961 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/32f0c013-e25d-4e15-bbfa-6824bd7f131e-config-volume\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.365176 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.365146 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjx86\" (UniqueName: \"kubernetes.io/projected/32f0c013-e25d-4e15-bbfa-6824bd7f131e-kube-api-access-zjx86\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.442450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.442414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.442450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.442463 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cwcc\" (UniqueName: 
\"kubernetes.io/projected/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-kube-api-access-7cwcc\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.442731 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.442494 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:21.442731 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.442579 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:21.442731 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.442590 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:21.442731 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.442640 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:53.442625719 +0000 UTC m=+66.217058575 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 23:54:21.442731 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.442653 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:21.942646537 +0000 UTC m=+34.717079393 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found Apr 24 23:54:21.451017 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.450983 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cwcc\" (UniqueName: \"kubernetes.io/projected/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-kube-api-access-7cwcc\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.644084 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.643990 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:21.644264 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.644117 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 23:54:21.644264 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.644131 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 23:54:21.644264 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.644140 2572 projected.go:194] Error preparing data for projected volume kube-api-access-x444b for pod openshift-network-diagnostics/network-check-target-5gffj: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:21.644264 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.644189 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b podName:388a768c-7e44-4c31-8196-916e3ba70a82 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:53.64417653 +0000 UTC m=+66.418609385 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x444b" (UniqueName: "kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b") pod "network-check-target-5gffj" (UID: "388a768c-7e44-4c31-8196-916e3ba70a82") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 23:54:21.845531 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.845496 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:21.846179 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.845647 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:21.846179 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.845723 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.845701709 +0000 UTC m=+35.620134575 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found Apr 24 23:54:21.946885 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:21.946846 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:21.947070 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.946981 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:21.947070 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:21.947042 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:22.94702389 +0000 UTC m=+35.721456747 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found Apr 24 23:54:22.013523 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.013489 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerStarted","Data":"39b3473954731c23707bde4367d76b6030026322e53b5dfa451b99bb56a34634"} Apr 24 23:54:22.823067 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.823028 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:22.823248 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.823028 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx" Apr 24 23:54:22.823248 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.823029 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:54:22.826993 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.826032 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 23:54:22.826993 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.826047 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wfsx7\"" Apr 24 23:54:22.827248 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.827230 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 23:54:22.827547 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.827242 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 23:54:22.827618 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.827601 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\"" Apr 24 23:54:22.827669 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.827606 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 23:54:22.854494 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.854461 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:54:22.854899 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:22.854601 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:54:22.854899 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:22.854663 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:24.854646355 +0000 UTC m=+37.629079211 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found Apr 24 23:54:22.955861 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:22.955818 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:54:22.956050 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:22.956006 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:54:22.956105 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:22.956075 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. 
No retries permitted until 2026-04-24 23:54:24.95605627 +0000 UTC m=+37.730489127 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found
Apr 24 23:54:23.018194 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.018156 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="39b3473954731c23707bde4367d76b6030026322e53b5dfa451b99bb56a34634" exitCode=0
Apr 24 23:54:23.018329 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.018218 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"39b3473954731c23707bde4367d76b6030026322e53b5dfa451b99bb56a34634"}
Apr 24 23:54:23.535058 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.535028 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"]
Apr 24 23:54:23.546114 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.546084 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"]
Apr 24 23:54:23.546243 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.546210 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.548794 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.548764 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 23:54:23.548951 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.548832 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 23:54:23.548951 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.548847 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 23:54:23.548951 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.548928 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-v67vk\""
Apr 24 23:54:23.548951 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.548934 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 23:54:23.581666 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.581633 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"]
Apr 24 23:54:23.595097 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.595061 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.595812 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.595787 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"]
Apr 24 23:54:23.597260 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.597236 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 23:54:23.597260 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.597248 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 23:54:23.597434 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.597312 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 23:54:23.597434 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.597250 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 23:54:23.662124 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dvmm\" (UniqueName: \"kubernetes.io/projected/72ce884f-e894-4a64-bdf6-68e0e9458ac7-kube-api-access-7dvmm\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.662257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662143 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.662257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662192 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.662257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662219 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.662257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.662407 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662270 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72ce884f-e894-4a64-bdf6-68e0e9458ac7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.662407 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662307 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z22wk\" (UniqueName: \"kubernetes.io/projected/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-kube-api-access-z22wk\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.662407 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.662382 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763054 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763024 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763208 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763058 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763208 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763208 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763133 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72ce884f-e894-4a64-bdf6-68e0e9458ac7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.763208 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z22wk\" (UniqueName: \"kubernetes.io/projected/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-kube-api-access-z22wk\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763210 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.763350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763307 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7dvmm\" (UniqueName: \"kubernetes.io/projected/72ce884f-e894-4a64-bdf6-68e0e9458ac7-kube-api-access-7dvmm\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.763350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.763329 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.765705 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.765682 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/72ce884f-e894-4a64-bdf6-68e0e9458ac7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.771602 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.771577 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.771771 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.771749 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.773444 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.773425 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.773688 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.773667 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.773729 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.773705 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.777845 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.777820 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z22wk\" (UniqueName: \"kubernetes.io/projected/61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9-kube-api-access-z22wk\") pod \"cluster-proxy-proxy-agent-5c44f9d677-qcs8l\" (UID: \"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:23.778746 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.778729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dvmm\" (UniqueName: \"kubernetes.io/projected/72ce884f-e894-4a64-bdf6-68e0e9458ac7-kube-api-access-7dvmm\") pod \"managed-serviceaccount-addon-agent-68856678df-ldsk6\" (UID: \"72ce884f-e894-4a64-bdf6-68e0e9458ac7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.868209 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.868139 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"
Apr 24 23:54:23.905257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:23.905226 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"
Apr 24 23:54:24.023409 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.023379 2572 generic.go:358] "Generic (PLEG): container finished" podID="c23cdd8c-e99e-473b-acb6-6602cadc65a1" containerID="366994f1d091685a798d2577e35da1da1a7db63996739a1353a572f1b89b576c" exitCode=0
Apr 24 23:54:24.023555 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.023426 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerDied","Data":"366994f1d091685a798d2577e35da1da1a7db63996739a1353a572f1b89b576c"}
Apr 24 23:54:24.028159 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.028131 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6"]
Apr 24 23:54:24.032609 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:54:24.032585 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72ce884f_e894_4a64_bdf6_68e0e9458ac7.slice/crio-b4606e8c0ed9a7325a922bc23fe11ef75e76b2f948f747ee5d32a269f608b284 WatchSource:0}: Error finding container b4606e8c0ed9a7325a922bc23fe11ef75e76b2f948f747ee5d32a269f608b284: Status 404 returned error can't find the container with id b4606e8c0ed9a7325a922bc23fe11ef75e76b2f948f747ee5d32a269f608b284
Apr 24 23:54:24.044585 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.044561 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l"]
Apr 24 23:54:24.872384 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.872343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:54:24.872759 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:24.872495 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:24.872759 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:24.872562 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:28.872545199 +0000 UTC m=+41.646978059 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:24.973707 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:24.973671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:54:24.973869 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:24.973845 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:24.973951 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:24.973940 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:28.973922336 +0000 UTC m=+41.748355209 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found
Apr 24 23:54:25.029450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:25.029349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qs259" event={"ID":"c23cdd8c-e99e-473b-acb6-6602cadc65a1","Type":"ContainerStarted","Data":"95e567e5b672494613275f61e351bc9cd23b93fdcda3440c9a259217b62a864f"}
Apr 24 23:54:25.030693 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:25.030661 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerStarted","Data":"33a643da56121bc7e299eaf29ece09136c4ed120b6ce38be19b0f1302e144200"}
Apr 24 23:54:25.031741 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:25.031717 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6" event={"ID":"72ce884f-e894-4a64-bdf6-68e0e9458ac7","Type":"ContainerStarted","Data":"b4606e8c0ed9a7325a922bc23fe11ef75e76b2f948f747ee5d32a269f608b284"}
Apr 24 23:54:25.052956 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:25.052877 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qs259" podStartSLOduration=6.614066518 podStartE2EDuration="38.052861127s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:53:50.358872255 +0000 UTC m=+3.133305111" lastFinishedPulling="2026-04-24 23:54:21.797666865 +0000 UTC m=+34.572099720" observedRunningTime="2026-04-24 23:54:25.051715403 +0000 UTC m=+37.826148284" watchObservedRunningTime="2026-04-24 23:54:25.052861127 +0000 UTC m=+37.827294005"
Apr 24 23:54:28.901966 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:28.901918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:54:28.902421 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:28.902109 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:28.902421 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:28.902195 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:36.90217325 +0000 UTC m=+49.676606109 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:29.003034 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:29.002994 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:54:29.003209 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:29.003106 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:29.003209 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:29.003186 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:37.003166446 +0000 UTC m=+49.777599303 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found
Apr 24 23:54:31.625257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:31.625218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:31.629023 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:31.629001 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f82ef8db-31f3-4480-973b-af1f4b6e810e-original-pull-secret\") pod \"global-pull-secret-syncer-xdtfx\" (UID: \"f82ef8db-31f3-4480-973b-af1f4b6e810e\") " pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:31.839690 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:31.839661 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xdtfx"
Apr 24 23:54:31.966483 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:31.966438 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xdtfx"]
Apr 24 23:54:31.969974 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:54:31.969937 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82ef8db_31f3_4480_973b_af1f4b6e810e.slice/crio-6498a7fb5c200df7468ca210c921cd00653add670b6708b2b7b6fc2c15dcb04d WatchSource:0}: Error finding container 6498a7fb5c200df7468ca210c921cd00653add670b6708b2b7b6fc2c15dcb04d: Status 404 returned error can't find the container with id 6498a7fb5c200df7468ca210c921cd00653add670b6708b2b7b6fc2c15dcb04d
Apr 24 23:54:32.046918 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:32.046885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerStarted","Data":"9766eb2d98d02212ed16a51a96d68c226094fd4943c469d2bad35b052b798b4e"}
Apr 24 23:54:32.051173 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:32.051135 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6" event={"ID":"72ce884f-e894-4a64-bdf6-68e0e9458ac7","Type":"ContainerStarted","Data":"b3fe3f1eba80944373717b98e47b1463f9d577f16103a9d3acc9ea0936e18943"}
Apr 24 23:54:32.052159 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:32.052133 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xdtfx" event={"ID":"f82ef8db-31f3-4480-973b-af1f4b6e810e","Type":"ContainerStarted","Data":"6498a7fb5c200df7468ca210c921cd00653add670b6708b2b7b6fc2c15dcb04d"}
Apr 24 23:54:32.067292 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:32.067234 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-68856678df-ldsk6" podStartSLOduration=1.9195929550000002 podStartE2EDuration="9.067221992s" podCreationTimestamp="2026-04-24 23:54:23 +0000 UTC" firstStartedPulling="2026-04-24 23:54:24.034402571 +0000 UTC m=+36.808835431" lastFinishedPulling="2026-04-24 23:54:31.182031613 +0000 UTC m=+43.956464468" observedRunningTime="2026-04-24 23:54:32.066480522 +0000 UTC m=+44.840913400" watchObservedRunningTime="2026-04-24 23:54:32.067221992 +0000 UTC m=+44.841654870"
Apr 24 23:54:36.967088 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:36.967047 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:54:36.967529 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:36.967195 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:36.967529 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:36.967268 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:54:52.967251951 +0000 UTC m=+65.741684811 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:37.064166 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.064131 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerStarted","Data":"952eaa8d6b13b09e72e3cf89687621592c10b1e6453de89a1710af18749fd06d"}
Apr 24 23:54:37.064308 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.064170 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerStarted","Data":"3117998aa6b60c4727dcf276b9a85e56441220a258b9427a938f1ab35006512b"}
Apr 24 23:54:37.065432 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.065405 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xdtfx" event={"ID":"f82ef8db-31f3-4480-973b-af1f4b6e810e","Type":"ContainerStarted","Data":"e416f3d5024e777efec1676c07c13371e31def24c3e7ea196fbb966ad0a8a462"}
Apr 24 23:54:37.067861 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.067842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:54:37.068002 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:37.067987 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:37.068049 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:37.068041 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:54:53.06802591 +0000 UTC m=+65.842458765 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found
Apr 24 23:54:37.083265 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.083216 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" podStartSLOduration=2.090922834 podStartE2EDuration="14.083201627s" podCreationTimestamp="2026-04-24 23:54:23 +0000 UTC" firstStartedPulling="2026-04-24 23:54:24.049345035 +0000 UTC m=+36.823777897" lastFinishedPulling="2026-04-24 23:54:36.041623829 +0000 UTC m=+48.816056690" observedRunningTime="2026-04-24 23:54:37.082247647 +0000 UTC m=+49.856680529" watchObservedRunningTime="2026-04-24 23:54:37.083201627 +0000 UTC m=+49.857634527"
Apr 24 23:54:37.098389 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:37.098344 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xdtfx" podStartSLOduration=34.017110885 podStartE2EDuration="38.098330131s" podCreationTimestamp="2026-04-24 23:53:59 +0000 UTC" firstStartedPulling="2026-04-24 23:54:31.971593324 +0000 UTC m=+44.746026180" lastFinishedPulling="2026-04-24 23:54:36.052812571 +0000 UTC m=+48.827245426" observedRunningTime="2026-04-24 23:54:37.097049638 +0000 UTC m=+49.871482516" watchObservedRunningTime="2026-04-24 23:54:37.098330131 +0000 UTC m=+49.872762988"
Apr 24 23:54:47.011607 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:47.011579 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rpfg"
Apr 24 23:54:52.988225 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:52.988180 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:54:52.988717 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:52.988341 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 23:54:52.988717 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:52.988424 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:55:24.988407091 +0000 UTC m=+97.762839947 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found
Apr 24 23:54:53.088745 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.088706 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:54:53.088922 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:53.088819 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 23:54:53.088922 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:53.088871 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:25.088857553 +0000 UTC m=+97.863290409 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found
Apr 24 23:54:53.491330 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.491296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt"
Apr 24 23:54:53.493422 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.493404 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 23:54:53.502089 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:53.502066 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 23:54:53.502163 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:54:53.502153 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:55:57.502135181 +0000 UTC m=+130.276568037 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : secret "metrics-daemon-secret" not found
Apr 24 23:54:53.693093 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.693051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:53.695622 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.695598 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 23:54:53.705039 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.705020 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 23:54:53.716009 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.715981 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x444b\" (UniqueName: \"kubernetes.io/projected/388a768c-7e44-4c31-8196-916e3ba70a82-kube-api-access-x444b\") pod \"network-check-target-5gffj\" (UID: \"388a768c-7e44-4c31-8196-916e3ba70a82\") " pod="openshift-network-diagnostics/network-check-target-5gffj"
Apr 24 23:54:53.735789 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.735762 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-wfsx7\""
Apr 24 23:54:53.744667 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.744616 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:53.858443 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:53.858413 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5gffj"] Apr 24 23:54:53.861565 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:54:53.861534 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod388a768c_7e44_4c31_8196_916e3ba70a82.slice/crio-333e4e62a8cd0707f5e84b8cff784c30d2fe31082784632b0a1b87fecef84394 WatchSource:0}: Error finding container 333e4e62a8cd0707f5e84b8cff784c30d2fe31082784632b0a1b87fecef84394: Status 404 returned error can't find the container with id 333e4e62a8cd0707f5e84b8cff784c30d2fe31082784632b0a1b87fecef84394 Apr 24 23:54:54.099525 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:54.099438 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5gffj" event={"ID":"388a768c-7e44-4c31-8196-916e3ba70a82","Type":"ContainerStarted","Data":"333e4e62a8cd0707f5e84b8cff784c30d2fe31082784632b0a1b87fecef84394"} Apr 24 23:54:57.107163 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:57.107125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5gffj" event={"ID":"388a768c-7e44-4c31-8196-916e3ba70a82","Type":"ContainerStarted","Data":"1fd9c47720b3a33b30f8eae14f2dff68b2f098226f953440ece37433f29bf911"} Apr 24 23:54:57.107545 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:57.107251 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:54:57.121923 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:54:57.121862 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5gffj" 
podStartSLOduration=66.425485032 podStartE2EDuration="1m9.12184759s" podCreationTimestamp="2026-04-24 23:53:48 +0000 UTC" firstStartedPulling="2026-04-24 23:54:53.863471877 +0000 UTC m=+66.637904733" lastFinishedPulling="2026-04-24 23:54:56.559834422 +0000 UTC m=+69.334267291" observedRunningTime="2026-04-24 23:54:57.121612405 +0000 UTC m=+69.896045281" watchObservedRunningTime="2026-04-24 23:54:57.12184759 +0000 UTC m=+69.896280468" Apr 24 23:55:25.020309 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:25.020269 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5" Apr 24 23:55:25.020688 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:25.020372 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 23:55:25.020688 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:25.020442 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls podName:32f0c013-e25d-4e15-bbfa-6824bd7f131e nodeName:}" failed. No retries permitted until 2026-04-24 23:56:29.020426168 +0000 UTC m=+161.794859025 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls") pod "dns-default-wrrx5" (UID: "32f0c013-e25d-4e15-bbfa-6824bd7f131e") : secret "dns-default-metrics-tls" not found Apr 24 23:55:25.120759 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:25.120724 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675" Apr 24 23:55:25.120903 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:25.120881 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 23:55:25.120973 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:25.120965 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert podName:0d0287de-4bd1-4d95-adbc-1ee225e3d1b2 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:29.120949608 +0000 UTC m=+161.895382464 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert") pod "ingress-canary-bm675" (UID: "0d0287de-4bd1-4d95-adbc-1ee225e3d1b2") : secret "canary-serving-cert" not found Apr 24 23:55:28.112455 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:28.112417 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5gffj" Apr 24 23:55:53.851460 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:53.851428 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bcdmk_5bcba16a-2d33-4168-8eae-a6ab55719a08/dns-node-resolver/0.log" Apr 24 23:55:55.251128 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:55.251099 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vs22k_9646a754-da93-4e1f-9571-2b775195390b/node-ca/0.log" Apr 24 23:55:57.554266 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:55:57.554209 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:55:57.554663 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:57.554361 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 24 23:55:57.554663 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:55:57.554458 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs podName:6ce242e4-92d1-4ff1-8276-05d4293cfb10 nodeName:}" failed. No retries permitted until 2026-04-24 23:57:59.554441444 +0000 UTC m=+252.328874301 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs") pod "network-metrics-daemon-xgtzt" (UID: "6ce242e4-92d1-4ff1-8276-05d4293cfb10") : secret "metrics-daemon-secret" not found Apr 24 23:56:16.641702 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.641662 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5f4446d468-6llfg"] Apr 24 23:56:16.644463 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.644447 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.646767 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.646738 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-f8xnt\"" Apr 24 23:56:16.646891 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.646812 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 24 23:56:16.646891 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.646831 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 24 23:56:16.647384 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.647368 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 24 23:56:16.656299 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.656275 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 24 23:56:16.661061 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.661035 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jm94p"] Apr 24 23:56:16.664182 ip-10-0-138-5 
kubenswrapper[2572]: I0424 23:56:16.664164 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f4446d468-6llfg"] Apr 24 23:56:16.664309 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.664297 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.667573 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.667550 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 23:56:16.667702 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.667619 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 23:56:16.667702 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.667555 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 23:56:16.667702 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.667663 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 23:56:16.671358 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.671343 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-rp52n\"" Apr 24 23:56:16.679231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.679211 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jm94p"] Apr 24 23:56:16.802141 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-bound-sa-token\") pod \"image-registry-5f4446d468-6llfg\" (UID: 
\"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802180 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-certificates\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62mgk\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-kube-api-access-62mgk\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802228 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.802343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802254 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-trusted-ca\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802516 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802344 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-image-registry-private-configuration\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802516 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802369 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-tls\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802516 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802394 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-ca-trust-extracted\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802516 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802474 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-installation-pull-secrets\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.802516 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802509 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/281c00f8-3ed3-4c70-b7ef-54aae75e6114-crio-socket\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.802717 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802532 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdc9c\" (UniqueName: \"kubernetes.io/projected/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-api-access-pdc9c\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.802717 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802551 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/281c00f8-3ed3-4c70-b7ef-54aae75e6114-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.802717 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.802596 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/281c00f8-3ed3-4c70-b7ef-54aae75e6114-data-volume\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903139 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdc9c\" (UniqueName: \"kubernetes.io/projected/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-api-access-pdc9c\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " 
pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903177 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/281c00f8-3ed3-4c70-b7ef-54aae75e6114-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903226 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903202 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/281c00f8-3ed3-4c70-b7ef-54aae75e6114-data-volume\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903235 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-bound-sa-token\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903273 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-certificates\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903293 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-62mgk\" (UniqueName: 
\"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-kube-api-access-62mgk\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903317 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-trusted-ca\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903399 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-image-registry-private-configuration\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903424 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-tls\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 
23:56:16.903448 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-ca-trust-extracted\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903819 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-installation-pull-secrets\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.903819 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/281c00f8-3ed3-4c70-b7ef-54aae75e6114-crio-socket\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903819 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903640 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/281c00f8-3ed3-4c70-b7ef-54aae75e6114-crio-socket\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.903819 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.903782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/281c00f8-3ed3-4c70-b7ef-54aae75e6114-data-volume\") pod 
\"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.904335 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.904310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-certificates\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.904515 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.904319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-ca-trust-extracted\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.904515 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.904398 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-trusted-ca\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.904697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.904532 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.905799 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.905770 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/281c00f8-3ed3-4c70-b7ef-54aae75e6114-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " pod="openshift-insights/insights-runtime-extractor-jm94p" Apr 24 23:56:16.905949 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.905930 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-image-registry-private-configuration\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.906011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.905987 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-installation-pull-secrets\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.906282 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.906262 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-registry-tls\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:16.913450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.913417 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdc9c\" (UniqueName: \"kubernetes.io/projected/281c00f8-3ed3-4c70-b7ef-54aae75e6114-kube-api-access-pdc9c\") pod \"insights-runtime-extractor-jm94p\" (UID: \"281c00f8-3ed3-4c70-b7ef-54aae75e6114\") " 
pod="openshift-insights/insights-runtime-extractor-jm94p"
Apr 24 23:56:16.913833 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.913812 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-62mgk\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-kube-api-access-62mgk\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg"
Apr 24 23:56:16.914021 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.914005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b9bdf6ba-b3c3-426c-8d35-5bd067e86d46-bound-sa-token\") pod \"image-registry-5f4446d468-6llfg\" (UID: \"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46\") " pod="openshift-image-registry/image-registry-5f4446d468-6llfg"
Apr 24 23:56:16.956057 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.956020 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5f4446d468-6llfg"
Apr 24 23:56:16.973127 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:16.973098 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jm94p"
Apr 24 23:56:17.123162 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.123131 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5f4446d468-6llfg"]
Apr 24 23:56:17.123924 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.123878 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jm94p"]
Apr 24 23:56:17.126877 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:17.126848 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bdf6ba_b3c3_426c_8d35_5bd067e86d46.slice/crio-e3b92e88e15cf3c1bef0089972f2f6de560d8c54266a38c1e320d573b7cbc7ff WatchSource:0}: Error finding container e3b92e88e15cf3c1bef0089972f2f6de560d8c54266a38c1e320d573b7cbc7ff: Status 404 returned error can't find the container with id e3b92e88e15cf3c1bef0089972f2f6de560d8c54266a38c1e320d573b7cbc7ff
Apr 24 23:56:17.127408 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:17.127386 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod281c00f8_3ed3_4c70_b7ef_54aae75e6114.slice/crio-71d5bf4c4de332b29bc2d891c409c9cbf061fdaa00be186a0799b6702bd12329 WatchSource:0}: Error finding container 71d5bf4c4de332b29bc2d891c409c9cbf061fdaa00be186a0799b6702bd12329: Status 404 returned error can't find the container with id 71d5bf4c4de332b29bc2d891c409c9cbf061fdaa00be186a0799b6702bd12329
Apr 24 23:56:17.296928 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.296868 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jm94p" event={"ID":"281c00f8-3ed3-4c70-b7ef-54aae75e6114","Type":"ContainerStarted","Data":"f9cc8c6f40eede606b7622e8e720f82bfcecf084bc3160600fa53e52f68c5ccf"}
Apr 24 23:56:17.297091 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.296931 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jm94p" event={"ID":"281c00f8-3ed3-4c70-b7ef-54aae75e6114","Type":"ContainerStarted","Data":"71d5bf4c4de332b29bc2d891c409c9cbf061fdaa00be186a0799b6702bd12329"}
Apr 24 23:56:17.298143 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.298119 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f4446d468-6llfg" event={"ID":"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46","Type":"ContainerStarted","Data":"31ffe62c1f0642fcc3a42a5014f7b609bc00b5789fb5dd2cdaf5f4aa9feeebe0"}
Apr 24 23:56:17.298143 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.298147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5f4446d468-6llfg" event={"ID":"b9bdf6ba-b3c3-426c-8d35-5bd067e86d46","Type":"ContainerStarted","Data":"e3b92e88e15cf3c1bef0089972f2f6de560d8c54266a38c1e320d573b7cbc7ff"}
Apr 24 23:56:17.298302 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.298290 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5f4446d468-6llfg"
Apr 24 23:56:17.316118 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:17.316067 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5f4446d468-6llfg" podStartSLOduration=1.3160508229999999 podStartE2EDuration="1.316050823s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:56:17.315324373 +0000 UTC m=+150.089757263" watchObservedRunningTime="2026-04-24 23:56:17.316050823 +0000 UTC m=+150.090483700"
Apr 24 23:56:18.302796 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:18.302751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jm94p" event={"ID":"281c00f8-3ed3-4c70-b7ef-54aae75e6114","Type":"ContainerStarted","Data":"4179ad4c2797db9b928fe3029be205302e7d07f8ca2d59c1ef4414638e9da171"}
Apr 24 23:56:19.306476 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:19.306441 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jm94p" event={"ID":"281c00f8-3ed3-4c70-b7ef-54aae75e6114","Type":"ContainerStarted","Data":"d29ccdd92c338e6ec9c85e1a1797cf8d64b651d95cd21133bf13b76591a473ba"}
Apr 24 23:56:19.325755 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:19.325644 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jm94p" podStartSLOduration=1.434900518 podStartE2EDuration="3.325628298s" podCreationTimestamp="2026-04-24 23:56:16 +0000 UTC" firstStartedPulling="2026-04-24 23:56:17.185276423 +0000 UTC m=+149.959709285" lastFinishedPulling="2026-04-24 23:56:19.076004194 +0000 UTC m=+151.850437065" observedRunningTime="2026-04-24 23:56:19.324273102 +0000 UTC m=+152.098705980" watchObservedRunningTime="2026-04-24 23:56:19.325628298 +0000 UTC m=+152.100061177"
Apr 24 23:56:24.180858 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:24.180804 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wrrx5" podUID="32f0c013-e25d-4e15-bbfa-6824bd7f131e"
Apr 24 23:56:24.200019 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:24.199981 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-bm675" podUID="0d0287de-4bd1-4d95-adbc-1ee225e3d1b2"
Apr 24 23:56:24.319006 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:24.318974 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:56:25.843980 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:25.843939 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-xgtzt" podUID="6ce242e4-92d1-4ff1-8276-05d4293cfb10"
Apr 24 23:56:29.094194 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.094141 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:56:29.096520 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.096490 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/32f0c013-e25d-4e15-bbfa-6824bd7f131e-metrics-tls\") pod \"dns-default-wrrx5\" (UID: \"32f0c013-e25d-4e15-bbfa-6824bd7f131e\") " pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:56:29.122247 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.122216 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-ktrpk\""
Apr 24 23:56:29.130210 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.130190 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:56:29.195437 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.195406 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:56:29.198200 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.198176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d0287de-4bd1-4d95-adbc-1ee225e3d1b2-cert\") pod \"ingress-canary-bm675\" (UID: \"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2\") " pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:56:29.247273 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.247226 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wrrx5"]
Apr 24 23:56:29.249802 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:29.249775 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32f0c013_e25d_4e15_bbfa_6824bd7f131e.slice/crio-1740ee4c1df9cce51f721f726fd96bec4e56c9c7e49f7d29f2beac159eeccd5a WatchSource:0}: Error finding container 1740ee4c1df9cce51f721f726fd96bec4e56c9c7e49f7d29f2beac159eeccd5a: Status 404 returned error can't find the container with id 1740ee4c1df9cce51f721f726fd96bec4e56c9c7e49f7d29f2beac159eeccd5a
Apr 24 23:56:29.331490 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:29.331455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wrrx5" event={"ID":"32f0c013-e25d-4e15-bbfa-6824bd7f131e","Type":"ContainerStarted","Data":"1740ee4c1df9cce51f721f726fd96bec4e56c9c7e49f7d29f2beac159eeccd5a"}
Apr 24 23:56:31.340830 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:31.340784 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wrrx5" event={"ID":"32f0c013-e25d-4e15-bbfa-6824bd7f131e","Type":"ContainerStarted","Data":"15f90fc02e47351e3559bb205068e0d166f5fdef6e1ecfa06d2482e1ac95c8a3"}
Apr 24 23:56:31.340830 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:31.340834 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wrrx5" event={"ID":"32f0c013-e25d-4e15-bbfa-6824bd7f131e","Type":"ContainerStarted","Data":"c956e4c3b9c56c6166181a2dfece0679987b5acc55702733dec91f50e53fa64d"}
Apr 24 23:56:31.341255 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:31.340958 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wrrx5"
Apr 24 23:56:31.358684 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:31.358636 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wrrx5" podStartSLOduration=129.108085531 podStartE2EDuration="2m10.358623685s" podCreationTimestamp="2026-04-24 23:54:21 +0000 UTC" firstStartedPulling="2026-04-24 23:56:29.251541068 +0000 UTC m=+162.025973924" lastFinishedPulling="2026-04-24 23:56:30.502079207 +0000 UTC m=+163.276512078" observedRunningTime="2026-04-24 23:56:31.357894999 +0000 UTC m=+164.132327877" watchObservedRunningTime="2026-04-24 23:56:31.358623685 +0000 UTC m=+164.133056562"
Apr 24 23:56:34.972760 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.972725 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"]
Apr 24 23:56:34.975808 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.975789 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:34.978096 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978074 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 24 23:56:34.978096 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978084 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-wlqd9\""
Apr 24 23:56:34.978286 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978084 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 23:56:34.978286 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978245 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 23:56:34.979019 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978998 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 24 23:56:34.979130 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.979022 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 24 23:56:34.979130 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.978999 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 24 23:56:34.986719 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.986699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"]
Apr 24 23:56:34.989257 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.989237 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-vhwmg"]
Apr 24 23:56:34.992148 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.992130 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:34.994211 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.994190 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 24 23:56:34.994316 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.994189 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 24 23:56:34.994316 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.994194 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hmz5f\""
Apr 24 23:56:34.994442 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:34.994324 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 24 23:56:35.040350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzphm\" (UniqueName: \"kubernetes.io/projected/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-api-access-mzphm\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.040350 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040348 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7s9b\" (UniqueName: \"kubernetes.io/projected/0255687a-51ff-45c6-acc3-b0d741369751-kube-api-access-q7s9b\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.040530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.040530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040438 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-wtmp\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.040530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.040530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040491 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-textfile\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.040530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040507 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.040934 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040874 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ce853-a580-46e1-96ce-4bfa4d5485fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.041025 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.040959 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.041025 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.041125 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041061 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.041209 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041189 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.041265 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041247 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-root\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.041314 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041282 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-metrics-client-ca\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.041364 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.041326 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-sys\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.142589 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142551 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.142768 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142607 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.142768 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142648 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.142768 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142680 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-root\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.142768 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142705 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-metrics-client-ca\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.142768 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142737 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-sys\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-sys\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142791 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-root\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzphm\" (UniqueName: \"kubernetes.io/projected/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-api-access-mzphm\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q7s9b\" (UniqueName: \"kubernetes.io/projected/0255687a-51ff-45c6-acc3-b0d741369751-kube-api-access-q7s9b\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.142962 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-wtmp\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:35.143072 2572 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 24 23:56:35.143112 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143126 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-textfile\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:35.143145 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls podName:0255687a-51ff-45c6-acc3-b0d741369751 nodeName:}" failed. No retries permitted until 2026-04-24 23:56:35.643124433 +0000 UTC m=+168.417557302 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls") pod "node-exporter-vhwmg" (UID: "0255687a-51ff-45c6-acc3-b0d741369751") : secret "node-exporter-tls" not found
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143175 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:35.143202 2572 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143208 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ce853-a580-46e1-96ce-4bfa4d5485fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143237 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:56:35.143249 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls podName:460ce853-a580-46e1-96ce-4bfa4d5485fa nodeName:}" failed. No retries permitted until 2026-04-24 23:56:35.643236621 +0000 UTC m=+168.417669484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-8sjf5" (UID: "460ce853-a580-46e1-96ce-4bfa4d5485fa") : secret "kube-state-metrics-tls" not found
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143248 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-wtmp\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143388 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-accelerators-collector-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143539 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143527 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0255687a-51ff-45c6-acc3-b0d741369751-metrics-client-ca\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.143882 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.143863 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/460ce853-a580-46e1-96ce-4bfa4d5485fa-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.144071 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.144053 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.144163 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.144144 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-textfile\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.144230 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.144211 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/460ce853-a580-46e1-96ce-4bfa4d5485fa-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.145345 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.145320 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.145442 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.145328 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.154212 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.154183 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7s9b\" (UniqueName: \"kubernetes.io/projected/0255687a-51ff-45c6-acc3-b0d741369751-kube-api-access-q7s9b\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.154413 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.154389 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzphm\" (UniqueName: \"kubernetes.io/projected/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-api-access-mzphm\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.647128 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.647086 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.647337 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.647134 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.649435 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.649413 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0255687a-51ff-45c6-acc3-b0d741369751-node-exporter-tls\") pod \"node-exporter-vhwmg\" (UID: \"0255687a-51ff-45c6-acc3-b0d741369751\") " pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.649582 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.649561 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/460ce853-a580-46e1-96ce-4bfa4d5485fa-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-8sjf5\" (UID: \"460ce853-a580-46e1-96ce-4bfa4d5485fa\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.823785 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.823753 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:56:35.826193 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.826171 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-h7snm\""
Apr 24 23:56:35.834944 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.834921 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bm675"
Apr 24 23:56:35.885530 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.885501 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"
Apr 24 23:56:35.901456 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.901396 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-vhwmg"
Apr 24 23:56:35.913361 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:35.913286 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0255687a_51ff_45c6_acc3_b0d741369751.slice/crio-70c27bb0972a5043b117536684e9c5b0efcc4f1d6ef5909e1acb30eb51577500 WatchSource:0}: Error finding container 70c27bb0972a5043b117536684e9c5b0efcc4f1d6ef5909e1acb30eb51577500: Status 404 returned error can't find the container with id 70c27bb0972a5043b117536684e9c5b0efcc4f1d6ef5909e1acb30eb51577500
Apr 24 23:56:35.968800 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:35.967958 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bm675"]
Apr 24 23:56:35.970953 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:35.970924 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d0287de_4bd1_4d95_adbc_1ee225e3d1b2.slice/crio-ee55913ef56a226f2744acb26b1547a61154a38b6bb3291009ccc9a509ab6140
WatchSource:0}: Error finding container ee55913ef56a226f2744acb26b1547a61154a38b6bb3291009ccc9a509ab6140: Status 404 returned error can't find the container with id ee55913ef56a226f2744acb26b1547a61154a38b6bb3291009ccc9a509ab6140 Apr 24 23:56:36.024223 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:36.024192 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-8sjf5"] Apr 24 23:56:36.028153 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:36.028126 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460ce853_a580_46e1_96ce_4bfa4d5485fa.slice/crio-a3149d5fd5a1f943c791aaf89f81f648ea803b12389300352f1e77830984add4 WatchSource:0}: Error finding container a3149d5fd5a1f943c791aaf89f81f648ea803b12389300352f1e77830984add4: Status 404 returned error can't find the container with id a3149d5fd5a1f943c791aaf89f81f648ea803b12389300352f1e77830984add4 Apr 24 23:56:36.355974 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:36.355892 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhwmg" event={"ID":"0255687a-51ff-45c6-acc3-b0d741369751","Type":"ContainerStarted","Data":"70c27bb0972a5043b117536684e9c5b0efcc4f1d6ef5909e1acb30eb51577500"} Apr 24 23:56:36.357456 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:36.357431 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5" event={"ID":"460ce853-a580-46e1-96ce-4bfa4d5485fa","Type":"ContainerStarted","Data":"a3149d5fd5a1f943c791aaf89f81f648ea803b12389300352f1e77830984add4"} Apr 24 23:56:36.358728 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:36.358702 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bm675" 
event={"ID":"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2","Type":"ContainerStarted","Data":"ee55913ef56a226f2744acb26b1547a61154a38b6bb3291009ccc9a509ab6140"} Apr 24 23:56:37.362651 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:37.362611 2572 generic.go:358] "Generic (PLEG): container finished" podID="0255687a-51ff-45c6-acc3-b0d741369751" containerID="cbd5866c66befe4c93a8fc14e8652548a968085c42074be0b082a2d70d097689" exitCode=0 Apr 24 23:56:37.363106 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:37.362701 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhwmg" event={"ID":"0255687a-51ff-45c6-acc3-b0d741369751","Type":"ContainerDied","Data":"cbd5866c66befe4c93a8fc14e8652548a968085c42074be0b082a2d70d097689"} Apr 24 23:56:38.308236 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.308204 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5f4446d468-6llfg" Apr 24 23:56:38.366985 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.366948 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bm675" event={"ID":"0d0287de-4bd1-4d95-adbc-1ee225e3d1b2","Type":"ContainerStarted","Data":"01cb50710ac127593b6c53237c81d1234b3bed18165472e8582573a99dfab43f"} Apr 24 23:56:38.368822 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.368800 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhwmg" event={"ID":"0255687a-51ff-45c6-acc3-b0d741369751","Type":"ContainerStarted","Data":"64217dda3b732c02c2c79e77194c027c2a06b28d79a52801842f52997a922f24"} Apr 24 23:56:38.368897 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.368827 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-vhwmg" event={"ID":"0255687a-51ff-45c6-acc3-b0d741369751","Type":"ContainerStarted","Data":"7467ca1f9e021212631043533a69cf2859c72d24935671a7fbb19872212ea5f4"} Apr 24 
23:56:38.370948 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.370923 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5" event={"ID":"460ce853-a580-46e1-96ce-4bfa4d5485fa","Type":"ContainerStarted","Data":"c1c88c73a015c1ea6bbde7ac6293c2a63337ecd373487a16c7ab2e66ac60337b"} Apr 24 23:56:38.371034 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.370952 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5" event={"ID":"460ce853-a580-46e1-96ce-4bfa4d5485fa","Type":"ContainerStarted","Data":"8ad8bfea82a9924e98bc723bc6d5f2a4bce81dee52aee87831fd4ab65255b0ff"} Apr 24 23:56:38.371034 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.370961 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5" event={"ID":"460ce853-a580-46e1-96ce-4bfa4d5485fa","Type":"ContainerStarted","Data":"f8587c75ccb2704f0d7303f70d8d666d36e45912fead8f02fa6e50e074b3c14b"} Apr 24 23:56:38.387543 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.387497 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bm675" podStartSLOduration=135.606504202 podStartE2EDuration="2m17.38748172s" podCreationTimestamp="2026-04-24 23:54:21 +0000 UTC" firstStartedPulling="2026-04-24 23:56:35.972966675 +0000 UTC m=+168.747399532" lastFinishedPulling="2026-04-24 23:56:37.753944181 +0000 UTC m=+170.528377050" observedRunningTime="2026-04-24 23:56:38.386452522 +0000 UTC m=+171.160885424" watchObservedRunningTime="2026-04-24 23:56:38.38748172 +0000 UTC m=+171.161914598" Apr 24 23:56:38.414655 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.414598 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-vhwmg" podStartSLOduration=3.61541389 podStartE2EDuration="4.414581652s" podCreationTimestamp="2026-04-24 23:56:34 
+0000 UTC" firstStartedPulling="2026-04-24 23:56:35.915237795 +0000 UTC m=+168.689670668" lastFinishedPulling="2026-04-24 23:56:36.714405557 +0000 UTC m=+169.488838430" observedRunningTime="2026-04-24 23:56:38.413823886 +0000 UTC m=+171.188256765" watchObservedRunningTime="2026-04-24 23:56:38.414581652 +0000 UTC m=+171.189014530" Apr 24 23:56:38.437041 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:38.436987 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-8sjf5" podStartSLOduration=2.718182867 podStartE2EDuration="4.436968512s" podCreationTimestamp="2026-04-24 23:56:34 +0000 UTC" firstStartedPulling="2026-04-24 23:56:36.030081157 +0000 UTC m=+168.804514024" lastFinishedPulling="2026-04-24 23:56:37.748866814 +0000 UTC m=+170.523299669" observedRunningTime="2026-04-24 23:56:38.436635224 +0000 UTC m=+171.211068103" watchObservedRunningTime="2026-04-24 23:56:38.436968512 +0000 UTC m=+171.211401391" Apr 24 23:56:39.823756 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:39.823720 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:56:41.159671 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.159639 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:41.163238 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.163220 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.165752 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.165726 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 24 23:56:41.165883 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.165851 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x26dp\"" Apr 24 23:56:41.167590 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.166408 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5p85qd82ecjrg\"" Apr 24 23:56:41.167590 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.166697 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 24 23:56:41.167590 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.167470 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 24 23:56:41.167798 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.167702 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 24 23:56:41.167896 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.167876 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 24 23:56:41.168138 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.168116 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 24 23:56:41.168363 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.168341 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" 
Apr 24 23:56:41.168469 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.168446 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 24 23:56:41.168947 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.168900 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 24 23:56:41.169457 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.169435 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 24 23:56:41.169961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.169940 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 24 23:56:41.173022 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.173004 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 24 23:56:41.174439 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.174416 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 24 23:56:41.180944 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.180921 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:41.291992 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.291961 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.291992 
ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.291997 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292024 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292047 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292080 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292104 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292147 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292189 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292167 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292301 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292356 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292404 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292437 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292462 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llcn2\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2\") pod \"prometheus-k8s-0\" (UID: 
\"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292507 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292489 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292731 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.292731 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.292549 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.346881 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.346852 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wrrx5" Apr 24 23:56:41.393436 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393405 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393436 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393440 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393465 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393498 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393532 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: 
I0424 23:56:41.393583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393662 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393610 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llcn2\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.393972 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393947 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394036 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.393997 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394036 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394023 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394137 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:56:41.394094 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394137 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394233 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394147 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394233 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394233 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394201 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394395 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394257 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394395 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394287 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394395 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394300 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394551 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394396 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.394640 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.394612 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.397343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.396148 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.397343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.397019 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.397343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.397293 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.397562 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.397391 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.398134 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.398106 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.398848 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.398588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.398848 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.398641 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.398848 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.398779 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.399061 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.399000 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.399578 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.399554 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.399801 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.399780 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.401022 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.400995 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.401236 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.401217 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.401491 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.401472 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.401529 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.401487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-llcn2\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2\") pod \"prometheus-k8s-0\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.476641 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.476609 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:56:41.600853 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:41.600803 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:56:41.604769 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:56:41.604735 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804465ea_5e38_456b_b233_bb8e8866411f.slice/crio-933d783b1bc28c59ce5b850d1a6bc52de9eb26ac08f6cae1e5cb0bf9b4fd7801 WatchSource:0}: Error finding container 933d783b1bc28c59ce5b850d1a6bc52de9eb26ac08f6cae1e5cb0bf9b4fd7801: Status 404 returned error can't find the container with id 933d783b1bc28c59ce5b850d1a6bc52de9eb26ac08f6cae1e5cb0bf9b4fd7801 Apr 24 23:56:42.381622 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:42.381578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"933d783b1bc28c59ce5b850d1a6bc52de9eb26ac08f6cae1e5cb0bf9b4fd7801"} Apr 24 23:56:43.385675 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:43.385618 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7" exitCode=0 Apr 24 23:56:43.386086 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:43.385711 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} Apr 24 23:56:46.396660 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:46.396611 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} Apr 24 23:56:46.396660 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:46.396656 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} Apr 24 23:56:48.405222 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:48.405161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} Apr 24 23:56:48.405222 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:48.405224 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} Apr 24 23:56:48.405615 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:48.405239 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} Apr 24 23:56:48.405615 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:48.405252 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerStarted","Data":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} Apr 24 23:56:48.436834 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:48.436783 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.510506098 podStartE2EDuration="7.436768171s" podCreationTimestamp="2026-04-24 23:56:41 +0000 UTC" firstStartedPulling="2026-04-24 23:56:41.606557398 +0000 UTC m=+174.380990254" lastFinishedPulling="2026-04-24 23:56:47.532819468 +0000 UTC m=+180.307252327" observedRunningTime="2026-04-24 23:56:48.434628496 +0000 UTC m=+181.209061374" watchObservedRunningTime="2026-04-24 23:56:48.436768171 +0000 UTC m=+181.211201049" Apr 24 23:56:51.477032 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:56:51.476990 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:13.906500 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:13.906437 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" podUID="61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 23:57:23.906422 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:23.906379 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" podUID="61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 23:57:33.906897 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:33.906855 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" podUID="61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9" 
containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 23:57:33.907304 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:33.906956 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" Apr 24 23:57:33.907450 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:33.907418 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"952eaa8d6b13b09e72e3cf89687621592c10b1e6453de89a1710af18749fd06d"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 23:57:33.907497 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:33.907483 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" podUID="61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9" containerName="service-proxy" containerID="cri-o://952eaa8d6b13b09e72e3cf89687621592c10b1e6453de89a1710af18749fd06d" gracePeriod=30 Apr 24 23:57:34.529708 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:34.529674 2572 generic.go:358] "Generic (PLEG): container finished" podID="61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9" containerID="952eaa8d6b13b09e72e3cf89687621592c10b1e6453de89a1710af18749fd06d" exitCode=2 Apr 24 23:57:34.529873 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:34.529747 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerDied","Data":"952eaa8d6b13b09e72e3cf89687621592c10b1e6453de89a1710af18749fd06d"} Apr 24 23:57:34.529873 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:34.529783 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5c44f9d677-qcs8l" event={"ID":"61dcd8d1-8c0c-4585-ac6f-5757afd2b8a9","Type":"ContainerStarted","Data":"f9408eb1a27cf38713210e93c44be6c43a66959a97920bca444e88f2e75cc01c"} Apr 24 23:57:41.476795 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:41.476747 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:41.495846 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:41.495812 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:41.562991 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:41.562961 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:59.541149 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541111 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:57:59.541659 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541582 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="prometheus" containerID="cri-o://253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" gracePeriod=600 Apr 24 23:57:59.541736 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541626 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy" containerID="cri-o://fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" gracePeriod=600 Apr 24 23:57:59.541736 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541652 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="config-reloader" containerID="cri-o://3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" gracePeriod=600 Apr 24 23:57:59.541862 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541625 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" gracePeriod=600 Apr 24 23:57:59.541862 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541636 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="thanos-sidecar" containerID="cri-o://f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" gracePeriod=600 Apr 24 23:57:59.542166 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.541647 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-web" containerID="cri-o://3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" gracePeriod=600 Apr 24 23:57:59.651742 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.651670 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:57:59.654558 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.654428 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ce242e4-92d1-4ff1-8276-05d4293cfb10-metrics-certs\") 
pod \"network-metrics-daemon-xgtzt\" (UID: \"6ce242e4-92d1-4ff1-8276-05d4293cfb10\") " pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:57:59.795291 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.795230 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:57:59.927144 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.927106 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-jk9j5\"" Apr 24 23:57:59.934885 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.934862 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-xgtzt" Apr 24 23:57:59.953916 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.953875 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954070 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.953938 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954070 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954045 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954190 ip-10-0-138-5 kubenswrapper[2572]: 
I0424 23:57:59.954086 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954190 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954117 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954190 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954160 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llcn2\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954334 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954191 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954334 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954217 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954334 ip-10-0-138-5 kubenswrapper[2572]: I0424 
23:57:59.954243 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954334 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954283 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954334 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954313 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954357 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954384 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954425 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954478 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954505 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954531 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.954566 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.954559 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"804465ea-5e38-456b-b233-bb8e8866411f\" (UID: \"804465ea-5e38-456b-b233-bb8e8866411f\") " Apr 24 23:57:59.955356 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.955323 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:57:59.955356 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.955340 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:59.955524 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.955404 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:59.956402 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.956151 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:59.957237 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.957148 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.957336 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.957307 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:59.957401 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.957341 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 23:57:59.957703 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.957651 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.957952 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.957876 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:59.959082 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959052 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.959175 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959151 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.959237 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959205 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.959287 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959274 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.959344 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959297 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.959484 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.959457 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out" (OuterVolumeSpecName: "config-out") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 23:57:59.960162 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.960137 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2" (OuterVolumeSpecName: "kube-api-access-llcn2") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "kube-api-access-llcn2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 23:57:59.960244 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.960180 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config" (OuterVolumeSpecName: "config") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:57:59.969701 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:57:59.969675 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config" (OuterVolumeSpecName: "web-config") pod "804465ea-5e38-456b-b233-bb8e8866411f" (UID: "804465ea-5e38-456b-b233-bb8e8866411f"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 23:58:00.052543 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.052455 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-xgtzt"] Apr 24 23:58:00.055235 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055211 2572 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-kube-rbac-proxy\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055237 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-trusted-ca-bundle\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055253 2572 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-config-out\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055266 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055281 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055292 2572 
reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-metrics-client-certs\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055301 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-db\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055310 2572 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055318 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055327 2572 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-grpc-tls\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055336 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-llcn2\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-kube-api-access-llcn2\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055343 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055346 2572 reconciler_common.go:299] "Volume 
detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-metrics-client-ca\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055359 2572 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-thanos-prometheus-http-client-file\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055368 2572 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-web-config\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055376 2572 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/804465ea-5e38-456b-b233-bb8e8866411f-tls-assets\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055383 2572 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/804465ea-5e38-456b-b233-bb8e8866411f-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055393 2572 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-secret-prometheus-k8s-tls\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.055405 2572 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/804465ea-5e38-456b-b233-bb8e8866411f-config\") on node \"ip-10-0-138-5.ec2.internal\" DevicePath \"\"" Apr 24 23:58:00.055697 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:58:00.055487 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ce242e4_92d1_4ff1_8276_05d4293cfb10.slice/crio-8baf09086018d0200b11a71c4cc6f10eec2c651c4636046da88d235c8054ee35 WatchSource:0}: Error finding container 8baf09086018d0200b11a71c4cc6f10eec2c651c4636046da88d235c8054ee35: Status 404 returned error can't find the container with id 8baf09086018d0200b11a71c4cc6f10eec2c651c4636046da88d235c8054ee35 Apr 24 23:58:00.602822 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602788 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" exitCode=0 Apr 24 23:58:00.602822 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602818 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" exitCode=0 Apr 24 23:58:00.602822 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602826 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" exitCode=0 Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602834 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" exitCode=0 Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602842 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" 
containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" exitCode=0 Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602850 2572 generic.go:358] "Generic (PLEG): container finished" podID="804465ea-5e38-456b-b233-bb8e8866411f" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" exitCode=0 Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602885 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602932 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602950 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602967 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602980 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.602994 2572 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.603007 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.603019 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"804465ea-5e38-456b-b233-bb8e8866411f","Type":"ContainerDied","Data":"933d783b1bc28c59ce5b850d1a6bc52de9eb26ac08f6cae1e5cb0bf9b4fd7801"} Apr 24 23:58:00.603380 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.603023 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" Apr 24 23:58:00.604233 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.604079 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xgtzt" event={"ID":"6ce242e4-92d1-4ff1-8276-05d4293cfb10","Type":"ContainerStarted","Data":"8baf09086018d0200b11a71c4cc6f10eec2c651c4636046da88d235c8054ee35"} Apr 24 23:58:00.612315 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.612244 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" Apr 24 23:58:00.619942 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.619924 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" Apr 24 23:58:00.627389 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.627330 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" Apr 
24 23:58:00.628815 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.628794 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:00.632517 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.632494 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:00.635222 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.635180 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" Apr 24 23:58:00.643642 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.643624 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" Apr 24 23:58:00.651389 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.651339 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7" Apr 24 23:58:00.657175 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657152 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:00.657488 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657471 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="config-reloader" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657492 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="config-reloader" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657508 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-web" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657517 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" 
containerName="kube-rbac-proxy-web" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657529 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657539 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657550 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-thanos" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657559 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-thanos" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657570 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="prometheus" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657575 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="prometheus" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657586 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="init-config-reloader" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657595 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="init-config-reloader" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657602 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="thanos-sidecar" Apr 24 23:58:00.657637 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657607 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="thanos-sidecar" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657676 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="thanos-sidecar" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657688 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-web" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657696 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy-thanos" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657708 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="prometheus" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657715 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="config-reloader" Apr 24 23:58:00.658421 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.657721 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="804465ea-5e38-456b-b233-bb8e8866411f" containerName="kube-rbac-proxy" Apr 24 23:58:00.659453 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.659431 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" Apr 24 23:58:00.659883 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.659841 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" Apr 24 23:58:00.659991 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.659878 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist" Apr 24 23:58:00.659991 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.659929 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" Apr 24 23:58:00.660247 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.660223 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" Apr 24 23:58:00.660306 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660259 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container 
\"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist" Apr 24 23:58:00.660306 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660283 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" Apr 24 23:58:00.660529 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.660507 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" Apr 24 23:58:00.660579 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660536 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist" Apr 24 23:58:00.660579 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660554 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" Apr 24 23:58:00.660829 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.660811 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with 
f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" Apr 24 23:58:00.660894 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660838 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist" Apr 24 23:58:00.660894 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.660854 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" Apr 24 23:58:00.661143 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.661119 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" Apr 24 23:58:00.661222 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661147 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 
3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist" Apr 24 23:58:00.661222 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661162 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" Apr 24 23:58:00.661401 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.661379 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" Apr 24 23:58:00.661454 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661408 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist" Apr 24 23:58:00.661454 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661428 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7" Apr 24 23:58:00.661700 ip-10-0-138-5 kubenswrapper[2572]: E0424 23:58:00.661676 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist" 
containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7" Apr 24 23:58:00.661778 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661705 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist" Apr 24 23:58:00.661778 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.661719 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9" Apr 24 23:58:00.662023 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662002 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist" Apr 24 23:58:00.662110 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662024 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449" Apr 24 23:58:00.662319 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662289 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status 
\"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist" Apr 24 23:58:00.662319 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662319 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c" Apr 24 23:58:00.662624 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662593 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist" Apr 24 23:58:00.662712 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662625 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.662712 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662624 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b" Apr 24 23:58:00.662993 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662936 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist" Apr 24 23:58:00.663064 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.662995 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68" Apr 24 23:58:00.663276 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663208 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist" Apr 24 23:58:00.663276 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663233 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759" Apr 24 23:58:00.663541 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663511 2572 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist"
Apr 24 23:58:00.663541 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663537 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"
Apr 24 23:58:00.663818 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663768 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist"
Apr 24 23:58:00.663927 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.663817 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"
Apr 24 23:58:00.664102 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664076 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist"
Apr 24 23:58:00.664102 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664103 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"
Apr 24 23:58:00.664392 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664365 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist"
Apr 24 23:58:00.664489 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664393 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"
Apr 24 23:58:00.664725 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664697 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist"
Apr 24 23:58:00.664820 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664726 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"
Apr 24 23:58:00.665001 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.664974 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist"
Apr 24 23:58:00.665074 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665001 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"
Apr 24 23:58:00.665122 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665082 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-x26dp\""
Apr 24 23:58:00.665122 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665111 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 24 23:58:00.665312 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665296 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 24 23:58:00.665349 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665313 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist"
Apr 24 23:58:00.665349 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665336 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"
Apr 24 23:58:00.665436 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665335 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 24 23:58:00.665436 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665382 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-5p85qd82ecjrg\""
Apr 24 23:58:00.665961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665585 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 24 23:58:00.665961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665592 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 24 23:58:00.665961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665683 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 24 23:58:00.665961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665720 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 24 23:58:00.665961 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665782 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 24 23:58:00.666231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665956 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist"
Apr 24 23:58:00.666231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.665981 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"
Apr 24 23:58:00.666231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666006 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 24 23:58:00.666231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666037 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 24 23:58:00.666426 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666304 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 24 23:58:00.666591 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666562 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist"
Apr 24 23:58:00.666674 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666593 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"
Apr 24 23:58:00.667059 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666971 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist"
Apr 24 23:58:00.667059 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.666998 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"
Apr 24 23:58:00.667300 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667279 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist"
Apr 24 23:58:00.667401 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667301 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"
Apr 24 23:58:00.667591 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667537 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist"
Apr 24 23:58:00.667591 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667561 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"
Apr 24 23:58:00.667972 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667926 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist"
Apr 24 23:58:00.667972 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.667965 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"
Apr 24 23:58:00.669487 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.669456 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist"
Apr 24 23:58:00.669586 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.669488 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"
Apr 24 23:58:00.670011 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.669986 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist"
Apr 24 23:58:00.670100 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.670013 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.670291 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.670314 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.674697 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.674722 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.674751 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.674853 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.675157 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist"
Apr 24 23:58:00.676372 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.675181 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"
Apr 24 23:58:00.676964 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.676839 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 24 23:58:00.677255 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.677228 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist"
Apr 24 23:58:00.677344 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.677257 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"
Apr 24 23:58:00.677808 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.677735 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist"
Apr 24 23:58:00.677942 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.677810 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"
Apr 24 23:58:00.678598 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.678503 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist"
Apr 24 23:58:00.678598 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.678533 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"
Apr 24 23:58:00.678988 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.678961 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist"
Apr 24 23:58:00.679082 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.679004 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"
Apr 24 23:58:00.679746 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.679695 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist"
Apr 24 23:58:00.679746 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.679724 2572 scope.go:117] "RemoveContainer" containerID="2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"
Apr 24 23:58:00.680025 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.679988 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9"} err="failed to get container status \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": rpc error: code = NotFound desc = could not find container \"2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9\": container with ID starting with 2892ba1c28f7d320e2308b5ff0784f27a9decb7c37602186dc5db6158bfbf5d9 not found: ID does not exist"
Apr 24 23:58:00.680025 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680013 2572 scope.go:117] "RemoveContainer" containerID="fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"
Apr 24 23:58:00.680231 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680213 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449"} err="failed to get container status \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": rpc error: code = NotFound desc = could not find container \"fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449\": container with ID starting with fde1e88885ce2dc86045369123167107be98e57466fbf351da81503cab569449 not found: ID does not exist"
Apr 24 23:58:00.680290 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680233 2572 scope.go:117] "RemoveContainer" containerID="3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"
Apr 24 23:58:00.680518 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680490 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c"} err="failed to get container status \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": rpc error: code = NotFound desc = could not find container \"3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c\": container with ID starting with 3fd5a133bf5e2850b26d7b5a3a19ca204588bfdb54d4a00bef97db492cff723c not found: ID does not exist"
Apr 24 23:58:00.680518 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680518 2572 scope.go:117] "RemoveContainer" containerID="f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"
Apr 24 23:58:00.680789 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680753 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b"} err="failed to get container status \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": rpc error: code = NotFound desc = could not find container \"f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b\": container with ID starting with f8b125e0e73520f95bd93b1bff555a36ad6220314cbe8fd2c22c5c040e864a2b not found: ID does not exist"
Apr 24 23:58:00.680789 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.680789 2572 scope.go:117] "RemoveContainer" containerID="3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"
Apr 24 23:58:00.681076 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.681050 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68"} err="failed to get container status \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": rpc error: code = NotFound desc = could not find container \"3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68\": container with ID starting with 3dafdad1425cb5b70fd9681b4e2d03d220c05bd79108b404b36d170adc8f5f68 not found: ID does not exist"
Apr 24 23:58:00.681076 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.681075 2572 scope.go:117] "RemoveContainer" containerID="253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"
Apr 24 23:58:00.681321 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.681300 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759"} err="failed to get container status \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": rpc error: code = NotFound desc = could not find container \"253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759\": container with ID starting with 253685c61890f0a8c02aab84032a88163608d7bb74753157afe2df0ac1cc3759 not found: ID does not exist"
Apr 24 23:58:00.681403 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.681323 2572 scope.go:117] "RemoveContainer" containerID="5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"
Apr 24 23:58:00.681571 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.681552 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7"} err="failed to get container status \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": rpc error: code = NotFound desc = could not find container \"5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7\": container with ID starting with 5bf2baf93c322412a61f08cf2a0d7f0ad06d594eddd8df39886e757b8e3ce9c7 not found: ID does not exist"
Apr 24 23:58:00.760945 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.760885 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761113 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.760992 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761113 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761037 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761113 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761113 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config-out\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761345 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761345 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761191 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761345 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761236 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761345 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761269 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761337 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761389 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761415 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761535 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761526 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-web-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761858 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.761858 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.761570 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txtzj\" (UniqueName: \"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-kube-api-access-txtzj\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862480 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862384 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862480 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862480 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862501 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862554 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862592 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862629 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-web-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862685 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862716 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txtzj\" (UniqueName: \"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-kube-api-access-txtzj\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.862757 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862743 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]:
I0424 23:58:00.862770 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862797 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862824 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862858 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config-out\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862882 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 
23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862929 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863179 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.862967 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.863610 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.863582 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.864439 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.864166 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.866935 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.865306 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: 
\"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.866935 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.866224 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.866935 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.866656 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.867832 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.867274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.868083 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.868061 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.868180 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.868136 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.868882 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.868784 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.868882 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.868844 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.869027 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.868927 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.869338 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.869309 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-web-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.869888 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.869865 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.870053 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.870031 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config-out\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.870430 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.870406 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-config\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.870485 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.870460 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.871295 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.871274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.873136 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.873114 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txtzj\" (UniqueName: 
\"kubernetes.io/projected/a74bc07a-2cfe-467b-8a6f-9a8ac6648806-kube-api-access-txtzj\") pod \"prometheus-k8s-0\" (UID: \"a74bc07a-2cfe-467b-8a6f-9a8ac6648806\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:00.979407 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:00.979364 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:01.150104 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.150073 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 24 23:58:01.155017 ip-10-0-138-5 kubenswrapper[2572]: W0424 23:58:01.154981 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda74bc07a_2cfe_467b_8a6f_9a8ac6648806.slice/crio-1ff0a61ebe0807fc0a3035fc042f470e4e5966f3cb1883b496052a20273af336 WatchSource:0}: Error finding container 1ff0a61ebe0807fc0a3035fc042f470e4e5966f3cb1883b496052a20273af336: Status 404 returned error can't find the container with id 1ff0a61ebe0807fc0a3035fc042f470e4e5966f3cb1883b496052a20273af336 Apr 24 23:58:01.609219 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.609186 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xgtzt" event={"ID":"6ce242e4-92d1-4ff1-8276-05d4293cfb10","Type":"ContainerStarted","Data":"56e6f21eff705d76b83e808770785a1cb1b4eaec2d4ea6081d8ab1635d749b93"} Apr 24 23:58:01.609219 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.609223 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-xgtzt" event={"ID":"6ce242e4-92d1-4ff1-8276-05d4293cfb10","Type":"ContainerStarted","Data":"d4e36453a093d98122414e5e027650c2c41b638238233dc524434ea80f12eb11"} Apr 24 23:58:01.610559 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.610534 2572 generic.go:358] "Generic (PLEG): container finished" 
podID="a74bc07a-2cfe-467b-8a6f-9a8ac6648806" containerID="0646eed7499f7645f14712b71192b2d2895fa63158f6bf3142a88c17929b4812" exitCode=0 Apr 24 23:58:01.610676 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.610578 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerDied","Data":"0646eed7499f7645f14712b71192b2d2895fa63158f6bf3142a88c17929b4812"} Apr 24 23:58:01.610676 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.610625 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"1ff0a61ebe0807fc0a3035fc042f470e4e5966f3cb1883b496052a20273af336"} Apr 24 23:58:01.624872 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.624820 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-xgtzt" podStartSLOduration=253.67249372 podStartE2EDuration="4m14.624804792s" podCreationTimestamp="2026-04-24 23:53:47 +0000 UTC" firstStartedPulling="2026-04-24 23:58:00.057142995 +0000 UTC m=+252.831575851" lastFinishedPulling="2026-04-24 23:58:01.009454068 +0000 UTC m=+253.783886923" observedRunningTime="2026-04-24 23:58:01.623761747 +0000 UTC m=+254.398194625" watchObservedRunningTime="2026-04-24 23:58:01.624804792 +0000 UTC m=+254.399237671" Apr 24 23:58:01.828925 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:01.828764 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804465ea-5e38-456b-b233-bb8e8866411f" path="/var/lib/kubelet/pods/804465ea-5e38-456b-b233-bb8e8866411f/volumes" Apr 24 23:58:02.618001 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.617962 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"69485dedf88a8f15843fa3427215aa2b270c07a357167fb6a8d3867ee1cf9da1"} Apr 24 23:58:02.618001 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.618001 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"0da2e5861c0ba6a71a8334b301fa04a7b3d9370baa84a96afdcd4093f65e3d62"} Apr 24 23:58:02.618410 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.618011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"45247f94e4a27f5629a6fc5b38d4ec3767ce4f0707628bcb3651c622c34b403a"} Apr 24 23:58:02.618410 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.618020 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"38d0fed60e255d34db93e7e4c1a512fc602bdbd4a50693cab5875011a7948c50"} Apr 24 23:58:02.618410 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.618028 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"27437faba258205c1f789a0fe5a8f743741d8bc87c010941a781b90d3c61dc43"} Apr 24 23:58:02.618410 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.618038 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a74bc07a-2cfe-467b-8a6f-9a8ac6648806","Type":"ContainerStarted","Data":"1e6fb393a954eb0cc928831e6d471020581bbdef4ba486ebc792b232593111bc"} Apr 24 23:58:02.648172 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:02.648117 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.648101306 podStartE2EDuration="2.648101306s" podCreationTimestamp="2026-04-24 23:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 23:58:02.646961285 +0000 UTC m=+255.421394162" watchObservedRunningTime="2026-04-24 23:58:02.648101306 +0000 UTC m=+255.422534184" Apr 24 23:58:05.980384 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:05.980338 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:58:47.720727 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:58:47.720697 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 23:59:00.980183 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:59:00.980142 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:59:00.995299 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:59:00.995271 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 24 23:59:01.795015 ip-10-0-138-5 kubenswrapper[2572]: I0424 23:59:01.794987 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 25 00:18:39.961722 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:39.961681 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xdtfx_f82ef8db-31f3-4480-973b-af1f4b6e810e/global-pull-secret-syncer/0.log" Apr 25 00:18:40.035771 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:40.035745 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-g9jvl_926c45d5-951c-4d9f-9ec3-07d7ca1a80dc/konnectivity-agent/0.log" Apr 25 00:18:40.110422 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:40.110392 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-5.ec2.internal_e99259f9405d85ee079d839cce796346/haproxy/0.log" Apr 25 00:18:43.309613 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.309568 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8sjf5_460ce853-a580-46e1-96ce-4bfa4d5485fa/kube-state-metrics/0.log" Apr 25 00:18:43.331160 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.331129 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8sjf5_460ce853-a580-46e1-96ce-4bfa4d5485fa/kube-rbac-proxy-main/0.log" Apr 25 00:18:43.352007 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.351978 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-8sjf5_460ce853-a580-46e1-96ce-4bfa4d5485fa/kube-rbac-proxy-self/0.log" Apr 25 00:18:43.614792 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.614753 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhwmg_0255687a-51ff-45c6-acc3-b0d741369751/node-exporter/0.log" Apr 25 00:18:43.633183 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.633158 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhwmg_0255687a-51ff-45c6-acc3-b0d741369751/kube-rbac-proxy/0.log" Apr 25 00:18:43.653433 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.653403 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-vhwmg_0255687a-51ff-45c6-acc3-b0d741369751/init-textfile/0.log" Apr 25 00:18:43.771319 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.771271 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/prometheus/0.log" Apr 25 00:18:43.788797 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.788772 2572 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/config-reloader/0.log" Apr 25 00:18:43.812923 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.812891 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/thanos-sidecar/0.log" Apr 25 00:18:43.832124 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.832054 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/kube-rbac-proxy-web/0.log" Apr 25 00:18:43.851443 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.851416 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/kube-rbac-proxy/0.log" Apr 25 00:18:43.870352 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.870329 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/kube-rbac-proxy-thanos/0.log" Apr 25 00:18:43.890945 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:43.890918 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a74bc07a-2cfe-467b-8a6f-9a8ac6648806/init-config-reloader/0.log" Apr 25 00:18:47.335556 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.335515 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"] Apr 25 00:18:47.338641 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.338618 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.340696 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.340673 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"kube-root-ca.crt\"" Apr 25 00:18:47.340817 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.340718 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-sv58q\"/\"openshift-service-ca.crt\"" Apr 25 00:18:47.340817 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.340744 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-sv58q\"/\"default-dockercfg-h5ljq\"" Apr 25 00:18:47.348161 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.348137 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"] Apr 25 00:18:47.382701 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.382673 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wrrx5_32f0c013-e25d-4e15-bbfa-6824bd7f131e/dns/0.log" Apr 25 00:18:47.400965 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.400939 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wrrx5_32f0c013-e25d-4e15-bbfa-6824bd7f131e/kube-rbac-proxy/0.log" Apr 25 00:18:47.420721 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.420693 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bcdmk_5bcba16a-2d33-4168-8eae-a6ab55719a08/dns-node-resolver/0.log" Apr 25 00:18:47.438948 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.438894 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-podres\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: 
\"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.439098 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.438979 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-sys\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.439098 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.439007 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-proc\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.439098 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.439043 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-lib-modules\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.439220 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.439132 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfbdk\" (UniqueName: \"kubernetes.io/projected/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-kube-api-access-tfbdk\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" Apr 25 00:18:47.540413 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540365 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-lib-modules\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540430 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tfbdk\" (UniqueName: \"kubernetes.io/projected/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-kube-api-access-tfbdk\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540491 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-podres\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540540 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-lib-modules\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540545 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-sys\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540595 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-sys\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540610 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540602 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-proc\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540820 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540651 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-podres\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.540820 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.540658 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-proc\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.548211 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.548187 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfbdk\" (UniqueName: \"kubernetes.io/projected/8d6cc9e5-bafc-4841-94fb-4641a5dca9bd-kube-api-access-tfbdk\") pod \"perf-node-gather-daemonset-lmc8k\" (UID: \"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd\") " pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.650031 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.649935 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.773844 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.773801 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"]
Apr 25 00:18:47.776804 ip-10-0-138-5 kubenswrapper[2572]: W0425 00:18:47.776768 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8d6cc9e5_bafc_4841_94fb_4641a5dca9bd.slice/crio-93c9d4f0c853e2c1726f525625c280c258c5c0c51f1713c67e4a32d19f216eae WatchSource:0}: Error finding container 93c9d4f0c853e2c1726f525625c280c258c5c0c51f1713c67e4a32d19f216eae: Status 404 returned error can't find the container with id 93c9d4f0c853e2c1726f525625c280c258c5c0c51f1713c67e4a32d19f216eae
Apr 25 00:18:47.778430 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.778412 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 25 00:18:47.899332 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.899289 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5f4446d468-6llfg_b9bdf6ba-b3c3-426c-8d35-5bd067e86d46/registry/0.log"
Apr 25 00:18:47.958370 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.958287 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-vs22k_9646a754-da93-4e1f-9571-2b775195390b/node-ca/0.log"
Apr 25 00:18:47.965736 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.965699 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" event={"ID":"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd","Type":"ContainerStarted","Data":"2d42fc418d3f62171c6c9b773b68b4a565032d79c4f000530df1ed4e52b7564b"}
Apr 25 00:18:47.965736 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.965733 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" event={"ID":"8d6cc9e5-bafc-4841-94fb-4641a5dca9bd","Type":"ContainerStarted","Data":"93c9d4f0c853e2c1726f525625c280c258c5c0c51f1713c67e4a32d19f216eae"}
Apr 25 00:18:47.965940 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.965852 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:47.983392 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:47.983340 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k" podStartSLOduration=0.983324929 podStartE2EDuration="983.324929ms" podCreationTimestamp="2026-04-25 00:18:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-25 00:18:47.981497847 +0000 UTC m=+1500.755930726" watchObservedRunningTime="2026-04-25 00:18:47.983324929 +0000 UTC m=+1500.757757806"
Apr 25 00:18:49.020101 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:49.020070 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-bm675_0d0287de-4bd1-4d95-adbc-1ee225e3d1b2/serve-healthcheck-canary/0.log"
Apr 25 00:18:49.515288 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:49.515254 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jm94p_281c00f8-3ed3-4c70-b7ef-54aae75e6114/kube-rbac-proxy/0.log"
Apr 25 00:18:49.537515 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:49.537486 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jm94p_281c00f8-3ed3-4c70-b7ef-54aae75e6114/exporter/0.log"
Apr 25 00:18:49.557826 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:49.557797 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jm94p_281c00f8-3ed3-4c70-b7ef-54aae75e6114/extractor/0.log"
Apr 25 00:18:53.978370 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:53.978344 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-sv58q/perf-node-gather-daemonset-lmc8k"
Apr 25 00:18:57.060774 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.060723 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-65fxs_0818dbcb-a498-4a49-8ca5-0b677796b068/kube-multus/0.log"
Apr 25 00:18:57.381858 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.381767 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/kube-multus-additional-cni-plugins/0.log"
Apr 25 00:18:57.399677 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.399652 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/egress-router-binary-copy/0.log"
Apr 25 00:18:57.417768 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.417742 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/cni-plugins/0.log"
Apr 25 00:18:57.435691 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.435669 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/bond-cni-plugin/0.log"
Apr 25 00:18:57.452943 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.452920 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/routeoverride-cni/0.log"
Apr 25 00:18:57.470471 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.470445 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/whereabouts-cni-bincopy/0.log"
Apr 25 00:18:57.488088 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.488057 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-qs259_c23cdd8c-e99e-473b-acb6-6602cadc65a1/whereabouts-cni/0.log"
Apr 25 00:18:57.614766 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.614734 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xgtzt_6ce242e4-92d1-4ff1-8276-05d4293cfb10/network-metrics-daemon/0.log"
Apr 25 00:18:57.632032 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:57.631967 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-xgtzt_6ce242e4-92d1-4ff1-8276-05d4293cfb10/kube-rbac-proxy/0.log"
Apr 25 00:18:58.406629 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.406596 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/ovn-controller/0.log"
Apr 25 00:18:58.431993 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.431957 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/ovn-acl-logging/0.log"
Apr 25 00:18:58.448396 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.448367 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/kube-rbac-proxy-node/0.log"
Apr 25 00:18:58.469135 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.469114 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/kube-rbac-proxy-ovn-metrics/0.log"
Apr 25 00:18:58.484923 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.484886 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/northd/0.log"
Apr 25 00:18:58.503413 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.503354 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/nbdb/0.log"
Apr 25 00:18:58.522403 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.522385 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/sbdb/0.log"
Apr 25 00:18:58.609742 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:18:58.609713 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rpfg_bbd086d5-cca3-4b01-aa4c-f76f49619285/ovnkube-controller/0.log"
Apr 25 00:19:00.146180 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:19:00.146153 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-5gffj_388a768c-7e44-4c31-8196-916e3ba70a82/network-check-target-container/0.log"
Apr 25 00:19:01.090692 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:19:01.090663 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-87xm9_79016484-eae2-4542-8926-e0955b9dfe90/iptables-alerter/0.log"
Apr 25 00:19:01.689480 ip-10-0-138-5 kubenswrapper[2572]: I0425 00:19:01.689451 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-tcsxn_9369f8ba-c07b-4dda-864a-5be415a51468/tuned/0.log"