Apr 22 18:33:27.298817 ip-10-0-131-85 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 18:33:27.298825 ip-10-0-131-85 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 18:33:27.298832 ip-10-0-131-85 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 18:33:27.299040 ip-10-0-131-85 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 18:33:37.542565 ip-10-0-131-85 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 18:33:37.542580 ip-10-0-131-85 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 65c97d463efe4d41b8416786a127f02b --
Apr 22 18:36:01.555356 ip-10-0-131-85 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:36:01.997890 ip-10-0-131-85 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:01.997890 ip-10-0-131-85 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:36:01.997890 ip-10-0-131-85 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:01.997890 ip-10-0-131-85 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:36:01.997890 ip-10-0-131-85 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:36:02.000787 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.000697 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:36:02.004217 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004201 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:36:02.004217 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004217 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004221 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004224 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004228 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004231 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004234 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004238 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004241 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004244 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004246 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004249 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004252 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004255 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004258 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004261 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004264 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004266 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004276 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004280 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004283 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:02.004286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004285 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004288 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004291 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004294 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004297 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004300 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004303 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004306 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004308 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004311 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004313 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004316 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004318 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004320 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004323 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004325 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004328 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004331 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004333 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004335 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:02.004780 ip-10-0-131-85 kubenswrapper[2571]: W0422 
18:36:02.004338 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004341 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004343 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004346 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004348 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004351 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004353 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004356 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004358 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004361 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004363 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004366 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004368 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004371 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004374 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004377 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004379 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004382 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004384 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:02.005270 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004387 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004389 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004392 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004394 2571 feature_gate.go:328] unrecognized 
feature gate: ClusterAPIInstall Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004397 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004400 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004403 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004405 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004408 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004412 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004414 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004417 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004419 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004421 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004424 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004426 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004429 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004431 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004436 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:36:02.005735 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004439 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004442 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004446 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004449 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004451 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004454 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004456 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004907 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004914 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004918 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004922 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004925 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004928 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004931 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004934 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004937 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004939 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004942 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004945 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004954 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:02.006210 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004957 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004959 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004962 2571 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004965 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004967 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004970 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004973 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004975 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004978 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004980 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004983 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004985 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004988 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004990 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004993 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004995 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.004998 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005000 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005003 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:02.006767 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005005 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005010 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005012 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005016 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005019 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005021 2571 feature_gate.go:328] unrecognized feature gate: 
Example2 Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005024 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005026 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005029 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005031 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005034 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005036 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005038 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005047 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005049 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005052 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005054 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005057 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005059 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005062 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:02.007259 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005065 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005067 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005071 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005075 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005078 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005081 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005099 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005102 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005104 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005108 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005113 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005115 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005118 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005121 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005124 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005126 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005129 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005131 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005134 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:02.007756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005136 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005139 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005141 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005144 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005146 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005148 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005151 
2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005159 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005162 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005165 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005167 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005170 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005172 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005176 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.005179 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006406 2571 flags.go:64] FLAG: --address="0.0.0.0" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006420 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006429 2571 flags.go:64] FLAG: --anonymous-auth="true" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006434 2571 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006438 2571 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006441 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 22 18:36:02.008230 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006446 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006450 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006454 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006457 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006460 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006464 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006467 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006470 2571 flags.go:64] FLAG: --cgroup-root="" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006473 2571 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006476 2571 flags.go:64] FLAG: 
--client-ca-file="" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006479 2571 flags.go:64] FLAG: --cloud-config="" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006482 2571 flags.go:64] FLAG: --cloud-provider="external" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006485 2571 flags.go:64] FLAG: --cluster-dns="[]" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006493 2571 flags.go:64] FLAG: --cluster-domain="" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006496 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006500 2571 flags.go:64] FLAG: --config-dir="" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006503 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006506 2571 flags.go:64] FLAG: --container-log-max-files="5" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006510 2571 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006520 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006523 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006527 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006530 2571 flags.go:64] FLAG: --contention-profiling="false" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006533 2571 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 22 18:36:02.008743 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006537 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006540 2571 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006543 2571 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006548 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006551 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006553 2571 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006556 2571 flags.go:64] FLAG: --enable-load-reader="false" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006559 2571 flags.go:64] FLAG: --enable-server="true" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006562 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006570 2571 flags.go:64] FLAG: --event-burst="100" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006573 2571 flags.go:64] FLAG: --event-qps="50" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006576 2571 flags.go:64] FLAG: 
--event-storage-age-limit="default=0" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006579 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006583 2571 flags.go:64] FLAG: --eviction-hard="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006587 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006590 2571 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006593 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006596 2571 flags.go:64] FLAG: --eviction-soft="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006599 2571 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006602 2571 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006605 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006608 2571 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006611 2571 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006614 2571 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006616 2571 flags.go:64] FLAG: --feature-gates="" Apr 22 18:36:02.009363 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006620 2571 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006623 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006626 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006635 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006639 2571 flags.go:64] FLAG: --healthz-port="10248" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006642 2571 flags.go:64] FLAG: --help="false" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006644 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006648 2571 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006651 2571 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006655 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006658 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006662 2571 flags.go:64] FLAG: 
--image-gc-high-threshold="85" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006665 2571 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006667 2571 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006670 2571 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006673 2571 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006676 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006679 2571 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006682 2571 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006685 2571 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006690 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006693 2571 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006696 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006699 2571 flags.go:64] FLAG: --lock-file="" Apr 22 18:36:02.009955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006702 2571 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006705 2571 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006708 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006713 2571 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006716 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006719 2571 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006722 2571 flags.go:64] FLAG: --logging-format="text" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006725 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006728 2571 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006731 2571 flags.go:64] FLAG: --manifest-url="" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006734 2571 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006738 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006747 2571 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006751 2571 flags.go:64] FLAG: --max-pods="110" Apr 
22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006754 2571 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006758 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006761 2571 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006764 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006767 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006770 2571 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006773 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006781 2571 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006784 2571 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006788 2571 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:36:02.010556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006791 2571 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006794 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006800 2571 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006803 2571 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006806 2571 flags.go:64] FLAG: --pods-per-core="0" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006825 2571 flags.go:64] FLAG: --port="10250" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006829 2571 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006832 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c1c96526dd93f082" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006836 2571 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006839 2571 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006842 2571 flags.go:64] FLAG: --register-node="true" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006845 2571 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006848 2571 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006852 2571 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006855 2571 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:36:02.006858 2571 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006860 2571 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006864 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006867 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006870 2571 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006873 2571 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006882 2571 flags.go:64] FLAG: --runonce="false" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006885 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006888 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006891 2571 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:36:02.011165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006894 2571 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006897 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006900 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006903 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006906 2571 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006909 2571 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006912 2571 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006915 2571 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006918 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006921 2571 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006924 2571 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006928 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006933 2571 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006936 2571 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006939 2571 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006946 2571 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006949 2571 
flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006952 2571 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006955 2571 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006958 2571 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006961 2571 flags.go:64] FLAG: --v="2" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006965 2571 flags.go:64] FLAG: --version="false" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006970 2571 flags.go:64] FLAG: --vmodule="" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006974 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.006978 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:36:02.011764 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007078 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007081 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007099 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007102 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007108 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007114 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007117 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007120 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007123 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007127 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007130 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007133 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007136 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007138 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007141 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007144 2571 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007146 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007150 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007154 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007156 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:02.012438 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007159 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007162 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007164 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007167 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007170 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007172 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007175 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007177 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007180 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007183 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007185 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007188 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007190 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007192 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007195 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007197 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007200 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007204 2571 feature_gate.go:328] 
unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007207 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007209 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:02.012985 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007212 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007215 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007217 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007219 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007222 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007224 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007228 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007232 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007235 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007237 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007240 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007243 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007245 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007248 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007250 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007253 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007255 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007258 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007260 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:02.013603 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007263 2571 feature_gate.go:328] unrecognized feature 
gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007265 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007268 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007271 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007273 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007276 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007278 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007281 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007283 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007286 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007290 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007292 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007295 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007298 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007301 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007303 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007306 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007308 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007311 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007313 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:02.014145 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007316 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007318 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007322 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007325 2571 
feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007328 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007330 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.007333 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:02.014705 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.008059 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:36:02.015005 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.014983 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 18:36:02.015041 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.015007 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 18:36:02.015076 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015061 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:02.015076 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015067 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:02.015076 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015069 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:02.015076 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015073 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:02.015076 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015076 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015079 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015082 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015100 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015104 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015107 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015109 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015112 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015114 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015117 
2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015121 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015126 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015129 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015132 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015135 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015137 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015141 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015144 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015147 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:02.015241 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015149 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015152 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015154 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015157 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015159 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015162 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015164 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015168 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015177 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015180 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015183 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015185 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015188 2571 
feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015190 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015193 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015196 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015199 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015201 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015205 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015209 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:02.015777 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015212 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015214 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015217 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015219 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015222 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015224 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015227 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015229 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015232 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015235 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015237 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015240 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015242 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015245 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015248 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:02.016363 ip-10-0-131-85 
kubenswrapper[2571]: W0422 18:36:02.015250 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015253 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015255 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015258 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015261 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:02.016363 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015264 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015273 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015276 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015279 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015281 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015284 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015286 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015289 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015291 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015294 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015297 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015299 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015302 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015304 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015307 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015309 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015312 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015314 2571 feature_gate.go:328] unrecognized feature gate: 
SigstoreImageVerification Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015317 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015319 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:02.016907 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015322 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015325 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015327 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.015332 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015456 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015461 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015464 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015467 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015470 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015472 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015475 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015478 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015481 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015483 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015491 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:36:02.017432 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015494 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015497 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 
18:36:02.015500 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015502 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015504 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015507 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015510 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015512 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015514 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015517 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015520 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015522 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015525 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015527 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015530 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015532 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015535 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015538 2571 feature_gate.go:328] unrecognized feature gate: Example Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015540 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:36:02.017813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015543 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015545 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015548 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015550 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015553 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015555 2571 feature_gate.go:328] unrecognized feature gate: 
ManagedBootImagesAzure Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015558 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015561 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015564 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015567 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015569 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015571 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015574 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015583 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015586 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015588 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015591 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015593 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015596 2571 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:36:02.018286 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015598 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015601 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015603 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015606 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015608 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015610 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015613 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015615 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015617 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk 
Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015620 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015623 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015625 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015628 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015630 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015633 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015635 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015637 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015640 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015642 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015645 2571 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:36:02.018751 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015647 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015650 2571 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015653 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015655 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015658 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015661 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015665 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015673 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015676 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015679 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015682 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015685 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015688 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015691 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015693 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015706 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:36:02.019247 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:02.015709 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.015714 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.016467 2571 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.018447 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.019370 2571 server.go:1019] "Starting client certificate rotation" Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.019467 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:36:02.019631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.019510 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 18:36:02.043554 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.043527 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:36:02.046473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.046453 2571 dynamic_cafile_content.go:161] "Starting controller" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 18:36:02.063947 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.063927 2571 log.go:25] "Validated CRI v1 runtime API" Apr 22 18:36:02.069072 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.069055 2571 log.go:25] "Validated CRI v1 image API" Apr 22 18:36:02.070330 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.070313 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 18:36:02.072525 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.072509 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:02.074237 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.074216 2571 fs.go:135] Filesystem UUIDs: map[51b7d919-a6dc-4cde-a2d3-d7060082aa1f:/dev/nvme0n1p4 68316c74-7e3c-421c-a48d-0cb4a5888b41:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 22 18:36:02.074292 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.074238 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 18:36:02.080164 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.079926 2571 manager.go:217] Machine: {Timestamp:2026-04-22 18:36:02.077938443 +0000 UTC m=+0.399706832 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099740 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d9d2ed26daa722edd254cbda3379a SystemUUID:ec2d9d2e-d26d-aa72-2edd-254cbda3379a BootID:65c97d46-3efe-4d41-b841-6786a127f02b Filesystems:[{Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6c:23:e8:2a:31 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6c:23:e8:2a:31 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:7e:06:e0:df:d2:79 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 
Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 18:36:02.080164 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.080161 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 18:36:02.080286 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.080273 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 18:36:02.081266 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.081240 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 18:36:02.081413 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.081269 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-85.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 18:36:02.081453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.081422 2571 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 18:36:02.081453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.081431 2571 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 18:36:02.081453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.081445 2571 manager.go:141] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:02.082147 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.082137 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 18:36:02.083441 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.083432 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:02.083559 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.083550 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 18:36:02.085846 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.085837 2571 kubelet.go:491] "Attempting to sync node with API server" Apr 22 18:36:02.085877 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.085854 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 18:36:02.085877 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.085866 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 18:36:02.085877 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.085876 2571 kubelet.go:397] "Adding apiserver pod source" Apr 22 18:36:02.085978 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.085885 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 18:36:02.087003 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.086991 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:02.087046 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.087012 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:36:02.090423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.090400 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:36:02.091853 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.091838 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:36:02.094513 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094493 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:36:02.094584 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094529 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:36:02.094584 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094543 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:36:02.094584 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094555 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:36:02.094584 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094563 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:36:02.094584 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094582 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094589 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094595 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094601 2571 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/fc" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094607 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094621 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:36:02.094718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.094632 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:36:02.095589 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.095575 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:36:02.095630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.095592 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:36:02.095738 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.095720 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v56gd" Apr 22 18:36:02.097157 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.097020 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:36:02.097157 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.097075 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-85.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:36:02.099949 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.099932 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:36:02.100043 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.100027 2571 server.go:1295] "Started kubelet" Apr 22 18:36:02.100187 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.100139 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:36:02.100254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.100159 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:36:02.100254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.100231 2571 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:36:02.100985 ip-10-0-131-85 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:36:02.101154 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.101140 2571 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:36:02.101218 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.101185 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-v56gd" Apr 22 18:36:02.104184 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.104168 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:36:02.108687 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.108669 2571 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 22 18:36:02.109208 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.109193 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:02.109746 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.109728 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:36:02.110543 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110526 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:36:02.110543 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110527 2571 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:36:02.110663 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110552 2571 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:36:02.110722 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.110663 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.110722 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110669 2571 factory.go:55] Registering systemd factory Apr 22 18:36:02.110722 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110699 2571 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:36:02.110722 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110719 2571 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:36:02.110879 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110727 2571 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:36:02.110930 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110911 2571 factory.go:153] Registering CRI-O factory Apr 22 18:36:02.110930 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110924 2571 factory.go:223] Registration of the crio container factory successfully Apr 22 18:36:02.111018 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.110977 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:36:02.111018 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.111003 2571 factory.go:103] Registering Raw factory Apr 22 18:36:02.111135 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.111029 2571 manager.go:1196] Started watching for new ooms in manager Apr 22 18:36:02.112261 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.111552 2571 manager.go:319] Starting recovery of all containers Apr 22 18:36:02.112748 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.112726 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:02.117976 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.117951 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-85.ec2.internal\" not found" node="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.118124 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.118004 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-85.ec2.internal" not found Apr 22 18:36:02.122264 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.122251 2571 manager.go:324] Recovery completed Apr 22 18:36:02.127728 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:36:02.127713 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.130176 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130159 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.130238 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130189 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.130238 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130200 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.130646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130628 2571 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:36:02.130646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130645 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:36:02.130748 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.130666 2571 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:36:02.132874 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.132860 2571 policy_none.go:49] "None policy: Start" Apr 22 18:36:02.132938 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.132883 2571 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:36:02.132938 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.132893 2571 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:36:02.135875 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.135858 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-85.ec2.internal" not found Apr 22 18:36:02.171679 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.171663 2571 manager.go:341] "Starting Device Plugin manager" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.171724 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.171735 2571 server.go:85] "Starting device plugin registration server" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.171959 2571 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.171971 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.172061 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.172181 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.172189 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.172679 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:36:02.180001 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.172723 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.195775 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.195759 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-131-85.ec2.internal" not found Apr 22 18:36:02.248255 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.248175 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:36:02.249491 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.249465 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:36:02.249491 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.249495 2571 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:36:02.249650 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.249515 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:36:02.249650 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.249524 2571 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:36:02.249650 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.249564 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:36:02.254292 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.254273 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:02.272639 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.272620 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.273999 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.273980 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.274080 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.274008 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.274080 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.274025 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.274080 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.274047 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.283007 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.282987 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.283117 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.283011 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-85.ec2.internal\": node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.298051 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.298028 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.349662 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.349636 2571 kubelet.go:2537] "SyncLoop ADD" source="file" 
pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal"] Apr 22 18:36:02.349739 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.349702 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.350846 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.350829 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.350940 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.350863 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.350940 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.350877 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.352260 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.352245 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.352408 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.352393 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.352459 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.352422 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.353951 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353931 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.354036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353932 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.354036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353962 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.354036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353972 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.354036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353982 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.354036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.353996 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.355256 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.355239 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.355340 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.355262 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:36:02.355895 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.355878 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:36:02.355980 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.355911 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:36:02.355980 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.355935 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:36:02.380715 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.380701 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-85.ec2.internal\" not found" node="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.385011 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.384996 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-85.ec2.internal\" not found" node="ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.398127 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.398103 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.412321 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.412295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.498735 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.498663 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.513011 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.512986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.513115 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.513023 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.513115 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.513049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/cfa5d89adc8adc45f4b9cd558035eee4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-85.ec2.internal\" (UID: \"cfa5d89adc8adc45f4b9cd558035eee4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.513115 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.513076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.599388 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.599363 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.613760 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.613734 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cfa5d89adc8adc45f4b9cd558035eee4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-85.ec2.internal\" (UID: \"cfa5d89adc8adc45f4b9cd558035eee4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.613845 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.613771 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.613845 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.613805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9f0c92d069d6b1142c54197b5e850203-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal\" (UID: \"9f0c92d069d6b1142c54197b5e850203\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.613845 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.613814 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/cfa5d89adc8adc45f4b9cd558035eee4-config\") pod \"kube-apiserver-proxy-ip-10-0-131-85.ec2.internal\" (UID: \"cfa5d89adc8adc45f4b9cd558035eee4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.682953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.682912 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.687550 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.687531 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:02.700228 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.700209 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.800801 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.800775 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.901287 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:02.901255 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-85.ec2.internal\" not found" Apr 22 18:36:02.957373 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:02.957351 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:03.010541 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.010520 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:03.019219 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.019199 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:36:03.019378 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.019352 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:03.019378 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.019371 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:03.019473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.019366 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:36:03.019473 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.019400 2571 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ab529ce1a2d38443a8d8fd8296197706-bf87a9218e9e9494.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.131.85:40390->13.216.108.216:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" Apr 22 18:36:03.019473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.019426 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" Apr 22 18:36:03.036503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.036483 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:36:03.086200 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.086154 2571 apiserver.go:52] "Watching apiserver" Apr 22 18:36:03.097868 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:36:03.097850 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:36:03.100337 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.100316 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vjdzq","openshift-network-diagnostics/network-check-target-lllsm","openshift-network-operator/iptables-alerter-qgkvh","openshift-ovn-kubernetes/ovnkube-node-rx4st","kube-system/konnectivity-agent-hcck9","kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2","openshift-cluster-node-tuning-operator/tuned-wknrq","openshift-image-registry/node-ca-djp9s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal","openshift-multus/multus-additional-cni-plugins-4dcm7","openshift-multus/multus-f8jk4","openshift-dns/node-resolver-gxtfk"] Apr 22 18:36:03.102228 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.102205 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.103461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.103435 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:03.103585 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.103515 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:03.105058 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105020 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:36:03.105433 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105413 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.105558 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105479 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lx5wx\"" Apr 22 18:36:03.105675 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105655 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.105876 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105850 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:31:02 +0000 UTC" deadline="2028-01-19 09:51:36.460612016 +0000 UTC" Apr 22 18:36:03.105970 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.105882 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15279h15m33.354738415s" Apr 22 18:36:03.107771 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.107755 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.107897 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.107864 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.108880 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.108853 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.109403 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.109363 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:36:03.109925 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.109908 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.110148 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110129 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-hbmj6\"" Apr 22 18:36:03.110231 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110175 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.110295 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110231 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:36:03.110348 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:36:03.110348 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.110229 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:03.110446 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110378 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.110446 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110420 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-znz9g\"" Apr 22 18:36:03.110539 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.110464 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.111250 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.111233 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:36:03.111341 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.111276 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:36:03.111394 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.111370 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.111646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.111627 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:36:03.111735 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.111667 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:36:03.112205 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.112193 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:36:03.112318 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.112298 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-ptg64\"" Apr 22 18:36:03.113097 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.112939 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.113097 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.113034 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.114335 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.114320 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.115238 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115221 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.115329 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115308 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.115781 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115733 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 18:36:03.115904 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.115965 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115929 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.116018 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.115985 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-789rh\"" Apr 22 18:36:03.116132 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116101 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.116309 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-netns\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116349 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116391 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-device-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.116439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116415 2571 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-57dw5\"" Apr 22 18:36:03.116439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-sys-fs\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116446 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-lib-modules\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-etc-tuned\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116497 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-slash\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116520 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-var-lib-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116547 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-env-overrides\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116568 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-script-lib\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116591 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:36:03.116616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqz7\" (UniqueName: \"kubernetes.io/projected/3a7e426e-76c8-4dc9-9425-11528ef0197e-kube-api-access-grqz7\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116653 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbqn\" (UniqueName: \"kubernetes.io/projected/f01e5c64-eadd-49f5-a2f4-4953111daa69-kube-api-access-ssbqn\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116687 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-kubernetes\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116728 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-var-lib-kubelet\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116753 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116753 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmfrc\" (UniqueName: \"kubernetes.io/projected/b04ddd7d-9290-4198-9b08-9617306b7172-kube-api-access-mmfrc\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116784 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116815 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 18:36:03.116871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfqd9\" (UniqueName: \"kubernetes.io/projected/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-kube-api-access-rfqd9\") pod 
\"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116940 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116976 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-jcfwz\"" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117017 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117050 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.116972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-systemd\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117252 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-systemd-units\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117273 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-systemd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-node-log\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:36:03.117305 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-bin\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-log-socket\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117348 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-netd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blmvs\" (UniqueName: \"kubernetes.io/projected/4d26e750-5c11-4023-9012-7dd824eeda4f-kube-api-access-blmvs\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117386 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34348a50-b524-4eb0-80d8-2866eaf0b1aa-konnectivity-ca\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-socket-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117433 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-registration-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.117799 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-conf\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117474 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-run\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-host-slash\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-kubelet\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-config\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-tmp\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117629 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-ovn\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117652 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d26e750-5c11-4023-9012-7dd824eeda4f-ovn-node-metrics-cert\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117698 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-iptables-alerter-script\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117737 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysconfig\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-sys\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117844 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-etc-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117928 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34348a50-b524-4eb0-80d8-2866eaf0b1aa-agent-certs\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117956 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-modprobe-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.117979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-host\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.118522 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-6gsp2\"" Apr 22 18:36:03.118798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.118804 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:36:03.119733 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.119358 2571 reflector.go:430] 
"Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-8k8nq\"" Apr 22 18:36:03.119733 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.119510 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:36:03.119733 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.119667 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:36:03.123181 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.123164 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:36:03.149136 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.149111 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-8jsnq" Apr 22 18:36:03.158643 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.158621 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-8jsnq" Apr 22 18:36:03.206759 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.206722 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa5d89adc8adc45f4b9cd558035eee4.slice/crio-0e35a84d398c95d36c338f0a912aa2e28934e79ddb62b7db28bb032a35791913 WatchSource:0}: Error finding container 0e35a84d398c95d36c338f0a912aa2e28934e79ddb62b7db28bb032a35791913: Status 404 returned error can't find the container with id 0e35a84d398c95d36c338f0a912aa2e28934e79ddb62b7db28bb032a35791913 Apr 22 18:36:03.207100 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.207073 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0c92d069d6b1142c54197b5e850203.slice/crio-802d31c0c5f5fcca5b52e2cbc8b58fe38a72997021d1764ea50f3f38e7c41c42 WatchSource:0}: Error finding container 802d31c0c5f5fcca5b52e2cbc8b58fe38a72997021d1764ea50f3f38e7c41c42: Status 404 returned error can't find the container with id 802d31c0c5f5fcca5b52e2cbc8b58fe38a72997021d1764ea50f3f38e7c41c42 Apr 22 18:36:03.211194 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.211174 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 18:36:03.211357 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.211341 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:36:03.218431 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-socket-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.218525 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218444 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-run\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.218525 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218470 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-host-slash\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.218631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218532 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-socket-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.218631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-run\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.218631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-host-slash\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.218631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-kubelet\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.218790 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-socket-dir-parent\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.218790 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-bin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.218790 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-kubelet\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.218790 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218713 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zxt7\" (UniqueName: \"kubernetes.io/projected/4717a966-da61-4170-b33d-9c683e74d3aa-kube-api-access-4zxt7\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 
18:36:03.218790 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-ovn\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218786 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d26e750-5c11-4023-9012-7dd824eeda4f-ovn-node-metrics-cert\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218827 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-ovn\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218831 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218879 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-hostroot\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218905 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f75b87a2-8899-4b74-9e48-0ca63be22b47-tmp-dir\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.218973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218936 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysconfig\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.218979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219002 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysconfig\") pod \"tuned-wknrq\" (UID: 
\"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219045 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-sys\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219057 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219100 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-etc-selinux\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-etc-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219142 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-modprobe-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-etc-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219078 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-sys\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-netns\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219221 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219270 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-modprobe-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219280 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219281 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-os-release\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219275 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219255 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-run-netns\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219328 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-os-release\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219383 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-kubelet\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-device-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-etc-tuned\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219469 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-slash\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219503 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-env-overrides\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219501 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-device-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219533 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219581 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-slash\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219579 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-system-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.219714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-daemon-config\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.220486 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:36:03.219686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-grqz7\" (UniqueName: \"kubernetes.io/projected/3a7e426e-76c8-4dc9-9425-11528ef0197e-kube-api-access-grqz7\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219727 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbqn\" (UniqueName: \"kubernetes.io/projected/f01e5c64-eadd-49f5-a2f4-4953111daa69-kube-api-access-ssbqn\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219798 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmfrc\" (UniqueName: \"kubernetes.io/projected/b04ddd7d-9290-4198-9b08-9617306b7172-kube-api-access-mmfrc\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-systemd-units\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219895 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-bin\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219922 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-systemd-units\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219929 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-etc-kubernetes\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219968 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.219999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-bin\") pod \"ovnkube-node-rx4st\" (UID: 
\"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-env-overrides\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.220152 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220081 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.220227 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:03.72019764 +0000 UTC m=+2.041966046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220248 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-node-log\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-d\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-log-socket\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.220486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220311 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-node-log\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220321 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" 
(UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-system-cni-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220336 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-log-socket\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220347 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-cnibin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220387 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-multus\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220411 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4717a966-da61-4170-b33d-9c683e74d3aa-serviceca\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220440 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-registration-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220466 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-conf\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220501 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n68kb\" (UniqueName: \"kubernetes.io/projected/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-kube-api-access-n68kb\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220534 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-registration-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220610 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-sysctl-conf\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220581 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-k8s-cni-cncf-io\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220667 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-netns\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220692 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-conf-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220716 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-tmp\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-iptables-alerter-script\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.221294 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220799 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34348a50-b524-4eb0-80d8-2866eaf0b1aa-agent-certs\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/f75b87a2-8899-4b74-9e48-0ca63be22b47-hosts-file\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220872 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-host\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220898 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cnibin\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220921 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fr5x\" (UniqueName: \"kubernetes.io/projected/f9c02c13-ac97-4da1-8f21-e45794600da6-kube-api-access-9fr5x\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220949 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-sys-fs\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.220978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-lib-modules\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221007 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-var-lib-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-script-lib\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blmvs\" (UniqueName: \"kubernetes.io/projected/4d26e750-5c11-4023-9012-7dd824eeda4f-kube-api-access-blmvs\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221081 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-multus-certs\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4xhb\" (UniqueName: \"kubernetes.io/projected/f75b87a2-8899-4b74-9e48-0ca63be22b47-kube-api-access-g4xhb\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221225 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-sys-fs\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-lib-modules\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221369 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-var-lib-openvswitch\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221632 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221631 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-iptables-alerter-script\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.222074 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4717a966-da61-4170-b33d-9c683e74d3aa-host\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221728 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-host\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " 
pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221877 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-script-lib\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221917 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221958 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.221989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-kubernetes\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-var-lib-kubelet\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222038 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222045 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a7e426e-76c8-4dc9-9425-11528ef0197e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-kubernetes\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222166 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/4d26e750-5c11-4023-9012-7dd824eeda4f-ovn-node-metrics-cert\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222183 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-var-lib-kubelet\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rfqd9\" (UniqueName: \"kubernetes.io/projected/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-kube-api-access-rfqd9\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-config\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-cni-binary-copy\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222282 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-systemd\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-systemd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.222916 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222362 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-netd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04ddd7d-9290-4198-9b08-9617306b7172-etc-systemd\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222392 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34348a50-b524-4eb0-80d8-2866eaf0b1aa-konnectivity-ca\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222416 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-run-systemd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222459 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/4d26e750-5c11-4023-9012-7dd824eeda4f-host-cni-netd\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222500 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-etc-tuned\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222569 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222840 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/4d26e750-5c11-4023-9012-7dd824eeda4f-ovnkube-config\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.223658 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.222941 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/34348a50-b524-4eb0-80d8-2866eaf0b1aa-konnectivity-ca\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.224148 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.224120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04ddd7d-9290-4198-9b08-9617306b7172-tmp\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.224868 ip-10-0-131-85 kubenswrapper[2571]: 
I0422 18:36:03.224848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/34348a50-b524-4eb0-80d8-2866eaf0b1aa-agent-certs\") pod \"konnectivity-agent-hcck9\" (UID: \"34348a50-b524-4eb0-80d8-2866eaf0b1aa\") " pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.227721 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.227699 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:03.227855 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.227840 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:03.227933 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.227925 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:03.228137 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.228123 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:03.728101366 +0000 UTC m=+2.049869759 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:03.228391 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.228367 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqz7\" (UniqueName: \"kubernetes.io/projected/3a7e426e-76c8-4dc9-9425-11528ef0197e-kube-api-access-grqz7\") pod \"aws-ebs-csi-driver-node-4l7w2\" (UID: \"3a7e426e-76c8-4dc9-9425-11528ef0197e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.228560 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.228537 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbqn\" (UniqueName: \"kubernetes.io/projected/f01e5c64-eadd-49f5-a2f4-4953111daa69-kube-api-access-ssbqn\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.230136 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.230112 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmfrc\" (UniqueName: \"kubernetes.io/projected/b04ddd7d-9290-4198-9b08-9617306b7172-kube-api-access-mmfrc\") pod \"tuned-wknrq\" (UID: \"b04ddd7d-9290-4198-9b08-9617306b7172\") " pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.230237 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.230196 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blmvs\" (UniqueName: 
\"kubernetes.io/projected/4d26e750-5c11-4023-9012-7dd824eeda4f-kube-api-access-blmvs\") pod \"ovnkube-node-rx4st\" (UID: \"4d26e750-5c11-4023-9012-7dd824eeda4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.232352 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.232329 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfqd9\" (UniqueName: \"kubernetes.io/projected/84ddb6a8-3bdd-4890-a3d5-b36eeb73829a-kube-api-access-rfqd9\") pod \"iptables-alerter-qgkvh\" (UID: \"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a\") " pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.252220 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.252173 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" event={"ID":"cfa5d89adc8adc45f4b9cd558035eee4","Type":"ContainerStarted","Data":"0e35a84d398c95d36c338f0a912aa2e28934e79ddb62b7db28bb032a35791913"} Apr 22 18:36:03.253179 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.253149 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" event={"ID":"9f0c92d069d6b1142c54197b5e850203","Type":"ContainerStarted","Data":"802d31c0c5f5fcca5b52e2cbc8b58fe38a72997021d1764ea50f3f38e7c41c42"} Apr 22 18:36:03.322855 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-socket-dir-parent\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.322960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322861 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-bin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.322960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4zxt7\" (UniqueName: \"kubernetes.io/projected/4717a966-da61-4170-b33d-9c683e74d3aa-kube-api-access-4zxt7\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.322960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322907 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.322960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-socket-dir-parent\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323122 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.322950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-bin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323122 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-hostroot\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323122 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323050 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f75b87a2-8899-4b74-9e48-0ca63be22b47-tmp-dir\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.323122 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323100 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-os-release\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323122 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323052 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-hostroot\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323285 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323127 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-os-release\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323285 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323191 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-os-release\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323285 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323194 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-os-release\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323285 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-kubelet\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323285 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323264 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-system-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323305 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-kubelet\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323315 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-daemon-config\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f75b87a2-8899-4b74-9e48-0ca63be22b47-tmp-dir\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323346 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-etc-kubernetes\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-system-cni-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-system-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323418 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-etc-kubernetes\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323435 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-cnibin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323447 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-system-cni-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323477 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323478 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-multus\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323503 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323513 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-var-lib-cni-multus\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323504 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-cnibin\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323516 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4717a966-da61-4170-b33d-9c683e74d3aa-serviceca\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n68kb\" (UniqueName: \"kubernetes.io/projected/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-kube-api-access-n68kb\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-k8s-cni-cncf-io\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323632 2571 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-netns\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-netns\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323673 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-k8s-cni-cncf-io\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323697 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-conf-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323729 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f75b87a2-8899-4b74-9e48-0ca63be22b47-hosts-file\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cnibin\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-conf-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fr5x\" (UniqueName: \"kubernetes.io/projected/f9c02c13-ac97-4da1-8f21-e45794600da6-kube-api-access-9fr5x\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323789 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-daemon-config\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323767 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323818 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-multus-certs\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.323958 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323825 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cnibin\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4xhb\" (UniqueName: \"kubernetes.io/projected/f75b87a2-8899-4b74-9e48-0ca63be22b47-kube-api-access-g4xhb\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323890 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-host-run-multus-certs\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4717a966-da61-4170-b33d-9c683e74d3aa-host\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323934 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/4717a966-da61-4170-b33d-9c683e74d3aa-serviceca\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323937 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323952 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f75b87a2-8899-4b74-9e48-0ca63be22b47-hosts-file\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.323987 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host\" (UniqueName: \"kubernetes.io/host-path/4717a966-da61-4170-b33d-9c683e74d3aa-host\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324006 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-cni-binary-copy\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324066 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f9c02c13-ac97-4da1-8f21-e45794600da6-multus-cni-dir\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-cni-binary-copy\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.324685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.324507 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f9c02c13-ac97-4da1-8f21-e45794600da6-cni-binary-copy\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.331904 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.331874 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4xhb\" (UniqueName: \"kubernetes.io/projected/f75b87a2-8899-4b74-9e48-0ca63be22b47-kube-api-access-g4xhb\") pod \"node-resolver-gxtfk\" (UID: \"f75b87a2-8899-4b74-9e48-0ca63be22b47\") " pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.331904 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.331883 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9fr5x\" (UniqueName: \"kubernetes.io/projected/f9c02c13-ac97-4da1-8f21-e45794600da6-kube-api-access-9fr5x\") pod \"multus-f8jk4\" (UID: \"f9c02c13-ac97-4da1-8f21-e45794600da6\") " pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.332451 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.332430 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n68kb\" (UniqueName: \"kubernetes.io/projected/a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79-kube-api-access-n68kb\") pod \"multus-additional-cni-plugins-4dcm7\" (UID: \"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79\") " pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.332507 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.332461 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zxt7\" (UniqueName: \"kubernetes.io/projected/4717a966-da61-4170-b33d-9c683e74d3aa-kube-api-access-4zxt7\") pod \"node-ca-djp9s\" (UID: \"4717a966-da61-4170-b33d-9c683e74d3aa\") " pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.430621 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.430591 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" Apr 22 18:36:03.436309 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.436288 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a7e426e_76c8_4dc9_9425_11528ef0197e.slice/crio-14d8a2949196ac4199c42a1249836727b82b14a0f6263adb95acb253fd9f6197 WatchSource:0}: Error finding container 14d8a2949196ac4199c42a1249836727b82b14a0f6263adb95acb253fd9f6197: Status 404 returned error can't find the container with id 14d8a2949196ac4199c42a1249836727b82b14a0f6263adb95acb253fd9f6197 Apr 22 18:36:03.451911 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.451889 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qgkvh" Apr 22 18:36:03.455558 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.455534 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:03.460582 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.460562 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84ddb6a8_3bdd_4890_a3d5_b36eeb73829a.slice/crio-edd4a219bd413e9cf9b41c8eadf952980c894558a6f93b913fc27f9cbdda6659 WatchSource:0}: Error finding container edd4a219bd413e9cf9b41c8eadf952980c894558a6f93b913fc27f9cbdda6659: Status 404 returned error can't find the container with id edd4a219bd413e9cf9b41c8eadf952980c894558a6f93b913fc27f9cbdda6659 Apr 22 18:36:03.461132 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.461110 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d26e750_5c11_4023_9012_7dd824eeda4f.slice/crio-87c73eddd9d24b1c33e9c75e8391a57c3815ea2b1f0006a2bb81a70b613e00bb WatchSource:0}: Error finding container 87c73eddd9d24b1c33e9c75e8391a57c3815ea2b1f0006a2bb81a70b613e00bb: Status 404 returned error can't find the container with id 87c73eddd9d24b1c33e9c75e8391a57c3815ea2b1f0006a2bb81a70b613e00bb Apr 22 18:36:03.471760 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.471744 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:03.477943 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.477923 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34348a50_b524_4eb0_80d8_2866eaf0b1aa.slice/crio-e191043f10fedc756a27d30398e68fc7ffc0c9ab1aabc754412c45cd60f52b4d WatchSource:0}: Error finding container e191043f10fedc756a27d30398e68fc7ffc0c9ab1aabc754412c45cd60f52b4d: Status 404 returned error can't find the container with id e191043f10fedc756a27d30398e68fc7ffc0c9ab1aabc754412c45cd60f52b4d Apr 22 18:36:03.490312 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.490293 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-wknrq" Apr 22 18:36:03.495655 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.495632 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04ddd7d_9290_4198_9b08_9617306b7172.slice/crio-b29dee1640b14c813bbfb9220df14cf384bcb1b65d69f257567752493fed5954 WatchSource:0}: Error finding container b29dee1640b14c813bbfb9220df14cf384bcb1b65d69f257567752493fed5954: Status 404 returned error can't find the container with id b29dee1640b14c813bbfb9220df14cf384bcb1b65d69f257567752493fed5954 Apr 22 18:36:03.505565 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.505547 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:03.516239 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.516221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-djp9s" Apr 22 18:36:03.521813 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.521794 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4717a966_da61_4170_b33d_9c683e74d3aa.slice/crio-b505ce4850a25162d99cae323bb624c9b5a57ff7d82258a121ab51c15074a13d WatchSource:0}: Error finding container b505ce4850a25162d99cae323bb624c9b5a57ff7d82258a121ab51c15074a13d: Status 404 returned error can't find the container with id b505ce4850a25162d99cae323bb624c9b5a57ff7d82258a121ab51c15074a13d Apr 22 18:36:03.525608 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.525592 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" Apr 22 18:36:03.531413 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.531392 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4a9c2e5_46e7_46c2_8cba_ab6a8dce5c79.slice/crio-a036b591a4328e29f3915ab738b628c80e30f5380e91dedacc2ae92718f25379 WatchSource:0}: Error finding container a036b591a4328e29f3915ab738b628c80e30f5380e91dedacc2ae92718f25379: Status 404 returned error can't find the container with id a036b591a4328e29f3915ab738b628c80e30f5380e91dedacc2ae92718f25379 Apr 22 18:36:03.534498 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.534481 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-f8jk4" Apr 22 18:36:03.538771 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.538753 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gxtfk" Apr 22 18:36:03.540982 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.540959 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9c02c13_ac97_4da1_8f21_e45794600da6.slice/crio-196e0b4988e080491f1c3af0ce80a0f59f42091b9194bb5eae36d9b691db1f5d WatchSource:0}: Error finding container 196e0b4988e080491f1c3af0ce80a0f59f42091b9194bb5eae36d9b691db1f5d: Status 404 returned error can't find the container with id 196e0b4988e080491f1c3af0ce80a0f59f42091b9194bb5eae36d9b691db1f5d Apr 22 18:36:03.545062 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:03.545038 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf75b87a2_8899_4b74_9e48_0ca63be22b47.slice/crio-f4f1e3a56f8e2e8761012f4667c13acc5833c6e88a460fd9f8b108b5555ac2f2 WatchSource:0}: Error finding container f4f1e3a56f8e2e8761012f4667c13acc5833c6e88a460fd9f8b108b5555ac2f2: Status 404 returned error can't find the container with id f4f1e3a56f8e2e8761012f4667c13acc5833c6e88a460fd9f8b108b5555ac2f2 Apr 22 18:36:03.727897 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.727788 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:03.728049 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.727943 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:03.728049 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.728004 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:04.727985923 +0000 UTC m=+3.049754301 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:03.828708 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.828671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:03.828878 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.828823 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:03.828878 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.828872 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:03.828991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.828884 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:03.828991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:03.828940 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:04.828921465 +0000 UTC m=+3.150689856 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:03.996282 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:03.996181 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:04.160203 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.159765 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:03 +0000 UTC" deadline="2027-12-10 06:05:03.945431615 +0000 UTC" Apr 22 18:36:04.160203 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.159815 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14315h28m59.785620697s" Apr 22 18:36:04.229060 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.229031 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:04.270136 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.270027 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f8jk4" event={"ID":"f9c02c13-ac97-4da1-8f21-e45794600da6","Type":"ContainerStarted","Data":"196e0b4988e080491f1c3af0ce80a0f59f42091b9194bb5eae36d9b691db1f5d"} Apr 22 18:36:04.287833 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.287794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerStarted","Data":"a036b591a4328e29f3915ab738b628c80e30f5380e91dedacc2ae92718f25379"} Apr 22 18:36:04.297686 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.297654 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djp9s" event={"ID":"4717a966-da61-4170-b33d-9c683e74d3aa","Type":"ContainerStarted","Data":"b505ce4850a25162d99cae323bb624c9b5a57ff7d82258a121ab51c15074a13d"} Apr 22 18:36:04.315787 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.315749 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"87c73eddd9d24b1c33e9c75e8391a57c3815ea2b1f0006a2bb81a70b613e00bb"} Apr 22 18:36:04.332420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.332344 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgkvh" event={"ID":"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a","Type":"ContainerStarted","Data":"edd4a219bd413e9cf9b41c8eadf952980c894558a6f93b913fc27f9cbdda6659"} Apr 22 18:36:04.345657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.345615 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gxtfk" event={"ID":"f75b87a2-8899-4b74-9e48-0ca63be22b47","Type":"ContainerStarted","Data":"f4f1e3a56f8e2e8761012f4667c13acc5833c6e88a460fd9f8b108b5555ac2f2"} Apr 22 18:36:04.356291 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.356256 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wknrq" 
event={"ID":"b04ddd7d-9290-4198-9b08-9617306b7172","Type":"ContainerStarted","Data":"b29dee1640b14c813bbfb9220df14cf384bcb1b65d69f257567752493fed5954"} Apr 22 18:36:04.370653 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.370617 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hcck9" event={"ID":"34348a50-b524-4eb0-80d8-2866eaf0b1aa","Type":"ContainerStarted","Data":"e191043f10fedc756a27d30398e68fc7ffc0c9ab1aabc754412c45cd60f52b4d"} Apr 22 18:36:04.373152 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.372914 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" event={"ID":"3a7e426e-76c8-4dc9-9425-11528ef0197e","Type":"ContainerStarted","Data":"14d8a2949196ac4199c42a1249836727b82b14a0f6263adb95acb253fd9f6197"} Apr 22 18:36:04.738196 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.738065 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:04.738360 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.738286 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:04.738360 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.738358 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:06.73833565 +0000 UTC m=+5.060104030 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:04.839021 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.838983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:04.839248 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.839220 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:04.839248 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.839244 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:04.839363 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.839258 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:04.839363 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:04.839319 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:06.839299273 +0000 UTC m=+5.161067654 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:04.928994 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:04.928960 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:36:05.160508 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.160461 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:31:03 +0000 UTC" deadline="2027-11-01 07:52:46.912684431 +0000 UTC" Apr 22 18:36:05.160508 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.160506 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13381h16m41.752182878s" Apr 22 18:36:05.250190 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.250142 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:05.250346 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:05.250260 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:05.250720 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.250695 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:05.250833 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:05.250802 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:05.909302 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.909269 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-n59sp"] Apr 22 18:36:05.916900 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.916880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:05.917014 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:05.916955 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:05.948739 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.948708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-dbus\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:05.948892 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.948761 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-kubelet-config\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:05.948892 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:05.948792 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.049603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.049724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-dbus\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.049761 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-kubelet-config\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.049878 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-kubelet-config\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.049982 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.050038 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:06.550017462 +0000 UTC m=+4.871785854 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.050420 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.050368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4e7619c0-c870-4eac-9372-f539d9128cc8-dbus\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.554845 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.554809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:06.555337 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.555003 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.555337 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.555072 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:07.555052783 +0000 UTC m=+5.876821161 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:06.756083 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.756043 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:06.756254 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.756226 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:06.756328 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.756293 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.756274501 +0000 UTC m=+9.078042880 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:06.857467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:06.857384 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:06.857611 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.857543 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:06.857611 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.857566 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:06.857611 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.857578 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:06.857774 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:06.857641 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:10.857622457 +0000 UTC m=+9.179390834 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:07.250143 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:07.250040 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:07.250143 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:07.250101 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:07.250357 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:07.250183 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:07.250357 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:07.250327 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:07.563514 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:07.563268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:07.563514 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:07.563402 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:07.563514 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:07.563457 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:09.563440429 +0000 UTC m=+7.885208809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:08.250685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:08.250642 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:08.250845 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:08.250803 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:09.250575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:09.250540 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:09.251027 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:09.250537 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:09.251027 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:09.250689 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:09.251027 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:09.250776 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:09.582354 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:09.582223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:09.582562 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:09.582403 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:09.582562 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:09.582483 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:13.582463254 +0000 UTC m=+11.904231647 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:10.250720 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:10.250683 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:10.251184 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.250866 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:10.792656 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:10.792597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:10.792852 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.792743 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.792852 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.792819 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:18.792798719 +0000 UTC m=+17.114567099 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:10.893815 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:10.893695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:10.893991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.893861 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:10.893991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.893883 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:10.893991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.893895 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:10.893991 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:10.893957 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:18.893939922 +0000 UTC m=+17.215708302 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:11.250054 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:11.249967 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:11.250054 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:11.250041 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:11.250279 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:11.250169 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:11.250336 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:11.250312 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:12.253370 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:12.251854 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:12.253370 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:12.252053 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:13.250195 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:13.250164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:13.250352 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:13.250164 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:13.250352 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:13.250295 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:13.250468 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:13.250371 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:13.612912 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:13.612866 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:13.613384 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:13.613024 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:13.613384 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:13.613120 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:21.613100202 +0000 UTC m=+19.934868594 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:14.250838 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:14.250801 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:14.251017 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:14.250933 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:15.250716 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:15.250676 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:15.251165 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:15.250683 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:15.251165 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:15.250795 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:15.251165 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:15.250889 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:16.252903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:16.252873 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:16.253386 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:16.252997 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:17.250541 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:17.250501 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:17.250704 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:17.250547 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:17.250704 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:17.250623 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:17.250813 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:17.250764 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:18.249952 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:18.249921 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:18.250426 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.250041 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:18.848065 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:18.848020 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:18.848272 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.848164 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:18.848272 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.848225 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:34.848209332 +0000 UTC m=+33.169977711 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:18.949331 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:18.949292 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:18.949534 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.949485 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:18.949534 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.949512 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:18.949534 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.949525 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:18.949684 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:18.949588 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:34.949569333 +0000 UTC m=+33.271337710 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:19.250817 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:19.250729 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:19.251284 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:19.250729 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:19.251284 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:19.250873 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:19.251284 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:19.250924 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:20.250039 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:20.249997 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:20.250240 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:20.250172 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:21.250650 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:21.250615 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:21.251081 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:21.250621 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:21.251081 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:21.250784 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:21.251081 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:21.250848 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:21.668113 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:21.667823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:21.668113 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:21.668035 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:21.668113 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:21.668081 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret podName:4e7619c0-c870-4eac-9372-f539d9128cc8 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:37.668069098 +0000 UTC m=+35.989837480 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret") pod "global-pull-secret-syncer-n59sp" (UID: "4e7619c0-c870-4eac-9372-f539d9128cc8") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:36:22.251284 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.251042 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:22.251886 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:22.251399 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:22.402967 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.402937 2571 generic.go:358] "Generic (PLEG): container finished" podID="9f0c92d069d6b1142c54197b5e850203" containerID="8d1cb4c03365c6287abd473f7db51fbcba2f6ee0ff875e8549a59a77ae440325" exitCode=0 Apr 22 18:36:22.403129 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.403014 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" event={"ID":"9f0c92d069d6b1142c54197b5e850203","Type":"ContainerDied","Data":"8d1cb4c03365c6287abd473f7db51fbcba2f6ee0ff875e8549a59a77ae440325"} Apr 22 18:36:22.404435 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.404405 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" event={"ID":"cfa5d89adc8adc45f4b9cd558035eee4","Type":"ContainerStarted","Data":"add52264a6aae98ff0df4389fb01f8b5a8805c06b727c9629139c2332f879026"} Apr 22 18:36:22.405719 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.405686 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-f8jk4" event={"ID":"f9c02c13-ac97-4da1-8f21-e45794600da6","Type":"ContainerStarted","Data":"dd4960ca562e0523dbc3f0afb0814eee21256239cf0d355356a93b4873a1cdc3"} Apr 22 18:36:22.407066 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.407048 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="2f79b26cd278226705cfc4936957f468ed8511357cba799249b1f86ff2b8809e" exitCode=0 Apr 22 18:36:22.407131 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.407115 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"2f79b26cd278226705cfc4936957f468ed8511357cba799249b1f86ff2b8809e"} Apr 22 18:36:22.408544 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.408501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-djp9s" event={"ID":"4717a966-da61-4170-b33d-9c683e74d3aa","Type":"ContainerStarted","Data":"88529ddf2691188b9ffcedb08917f94534771afc311e415f451551509d247012"} Apr 22 18:36:22.410993 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.410976 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411338 2571 generic.go:358] "Generic (PLEG): container finished" podID="4d26e750-5c11-4023-9012-7dd824eeda4f" containerID="52168a409552182ca4b1f13a3568e91f2ce2aa450b9a6c25306bfa057305f568" exitCode=1 Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411348 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"7785298a8dedea384f62ab44ed0b0279c18490d2a539ff8cba86a0d88206bb6c"} Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" 
event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"56f6f0f0edacdcbba7581fd8dfe046f046c42af5b87971a320f9db5b9cef0dab"} Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411386 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"d755e84eb468de781a52bba611e076981edabdd1a826ae7e16f5b160f6a8375d"} Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411400 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"db7953d18e75b7a1137b00d348336cf92c8df60dd491d9576b27ede9c8b7a40d"} Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411410 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerDied","Data":"52168a409552182ca4b1f13a3568e91f2ce2aa450b9a6c25306bfa057305f568"} Apr 22 18:36:22.411423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.411422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"d15d9e0cdd3c52dee4efe8227129554c2e9999c60b3b08231d2bb3ecef195a8b"} Apr 22 18:36:22.412950 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.412921 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gxtfk" event={"ID":"f75b87a2-8899-4b74-9e48-0ca63be22b47","Type":"ContainerStarted","Data":"f6b47121327ee2f57a5923aa9d584a26f51d4fedce622ff2439b7c919684be26"} Apr 22 18:36:22.414429 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.414404 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-wknrq" event={"ID":"b04ddd7d-9290-4198-9b08-9617306b7172","Type":"ContainerStarted","Data":"38cc8cdcc52ad9b8f4c1065b7d2e927c026876fd57024d14ecfbdf7ea0162ce1"} Apr 22 18:36:22.415820 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.415802 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hcck9" event={"ID":"34348a50-b524-4eb0-80d8-2866eaf0b1aa","Type":"ContainerStarted","Data":"11dca4d31700d6649ff17323116d3a3a43912a2deed702646cd3456fd4301780"} Apr 22 18:36:22.416955 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.416938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" event={"ID":"3a7e426e-76c8-4dc9-9425-11528ef0197e","Type":"ContainerStarted","Data":"b49b6fe6e423912418e02342cbc1e45ddf533067a67519aa89bb47f56bd50081"} Apr 22 18:36:22.436560 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.436523 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-wknrq" podStartSLOduration=2.532478023 podStartE2EDuration="20.436509857s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.497020785 +0000 UTC m=+1.818789162" lastFinishedPulling="2026-04-22 18:36:21.401052617 +0000 UTC m=+19.722820996" observedRunningTime="2026-04-22 18:36:22.436200885 +0000 UTC m=+20.757969282" watchObservedRunningTime="2026-04-22 18:36:22.436509857 +0000 UTC m=+20.758278267" Apr 22 18:36:22.450978 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:36:22.450933 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-djp9s" podStartSLOduration=2.594002182 podStartE2EDuration="20.450917771s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.523190964 +0000 UTC m=+1.844959340" lastFinishedPulling="2026-04-22 18:36:21.380106548 +0000 UTC m=+19.701874929" observedRunningTime="2026-04-22 18:36:22.450781519 +0000 UTC m=+20.772549918" watchObservedRunningTime="2026-04-22 18:36:22.450917771 +0000 UTC m=+20.772686169" Apr 22 18:36:22.488679 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.488636 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-85.ec2.internal" podStartSLOduration=19.488621971 podStartE2EDuration="19.488621971s" podCreationTimestamp="2026-04-22 18:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:22.488565644 +0000 UTC m=+20.810334041" watchObservedRunningTime="2026-04-22 18:36:22.488621971 +0000 UTC m=+20.810390376" Apr 22 18:36:22.502843 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.502802 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hcck9" podStartSLOduration=2.602022773 podStartE2EDuration="20.50278974s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.479426624 +0000 UTC m=+1.801195000" lastFinishedPulling="2026-04-22 18:36:21.38019358 +0000 UTC m=+19.701961967" observedRunningTime="2026-04-22 18:36:22.502587512 +0000 UTC m=+20.824355923" watchObservedRunningTime="2026-04-22 18:36:22.50278974 +0000 UTC m=+20.824558137" Apr 22 18:36:22.519015 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.518975 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-f8jk4" podStartSLOduration=2.409379166 podStartE2EDuration="20.518960921s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.542488717 +0000 UTC m=+1.864257097" lastFinishedPulling="2026-04-22 18:36:21.652070457 +0000 UTC m=+19.973838852" observedRunningTime="2026-04-22 18:36:22.518589701 +0000 UTC m=+20.840358100" watchObservedRunningTime="2026-04-22 18:36:22.518960921 +0000 UTC m=+20.840729318" Apr 22 18:36:22.537051 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:22.536951 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gxtfk" podStartSLOduration=2.679152489 podStartE2EDuration="20.536931123s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.546454454 +0000 UTC m=+1.868222830" lastFinishedPulling="2026-04-22 18:36:21.404233088 +0000 UTC m=+19.726001464" observedRunningTime="2026-04-22 18:36:22.535872011 +0000 UTC m=+20.857640410" watchObservedRunningTime="2026-04-22 18:36:22.536931123 +0000 UTC m=+20.858699522" Apr 22 18:36:23.250580 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.250403 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:23.250758 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.250466 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:23.250758 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:23.250676 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:23.250869 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:23.250759 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:23.311335 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.311308 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:36:23.420758 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.420670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" event={"ID":"3a7e426e-76c8-4dc9-9425-11528ef0197e","Type":"ContainerStarted","Data":"e865ea3d0b77f5e45b566e4b913b7e837f3c179fe8209cd69b754b59c26b37bf"} Apr 22 18:36:23.422563 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.422538 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" event={"ID":"9f0c92d069d6b1142c54197b5e850203","Type":"ContainerStarted","Data":"b31d645ea69a267daf061117f71c3231aa2537c6b627b15691f74d2edae7a467"} Apr 22 18:36:23.424179 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.424123 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qgkvh" event={"ID":"84ddb6a8-3bdd-4890-a3d5-b36eeb73829a","Type":"ContainerStarted","Data":"39a0fb1cd688efa24f0733ef93d19b2c90eb9931b2e126232d60afcda3a34256"} Apr 22 18:36:23.438843 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.438795 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-85.ec2.internal" podStartSLOduration=20.438782028 podStartE2EDuration="20.438782028s" podCreationTimestamp="2026-04-22 18:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:36:23.438541494 +0000 UTC m=+21.760309896" watchObservedRunningTime="2026-04-22 18:36:23.438782028 +0000 UTC m=+21.760550427" Apr 22 18:36:23.452959 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:23.452891 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qgkvh" podStartSLOduration=3.534897377 podStartE2EDuration="21.452877213s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.462182899 +0000 UTC m=+1.783951277" lastFinishedPulling="2026-04-22 18:36:21.38016271 +0000 UTC m=+19.701931113" observedRunningTime="2026-04-22 18:36:23.452654012 +0000 UTC m=+21.774422412" 
watchObservedRunningTime="2026-04-22 18:36:23.452877213 +0000 UTC m=+21.774645610" Apr 22 18:36:24.185942 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.185836 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:36:23.31133279Z","UUID":"c2a60092-bb38-40d0-9448-63d9b516a657","Handler":null,"Name":"","Endpoint":""} Apr 22 18:36:24.189651 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.189627 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:36:24.189753 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.189659 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:36:24.250189 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.249994 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:24.250189 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:24.250154 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:24.428047 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.427962 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" event={"ID":"3a7e426e-76c8-4dc9-9425-11528ef0197e","Type":"ContainerStarted","Data":"e5b77c1a8df372ffe0d946e37a8f463c1c875b37e2b26d1ea0865a3d9a3a03b3"} Apr 22 18:36:24.431020 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.430987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:36:24.431751 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.431397 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"e7006e118a0208bdb81ba0c2bda71dafbda86217b0997999d40c982c2a5a191f"} Apr 22 18:36:24.448135 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:24.448074 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-4l7w2" podStartSLOduration=1.808891134 podStartE2EDuration="22.448059382s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.437778342 +0000 UTC m=+1.759546718" lastFinishedPulling="2026-04-22 18:36:24.076946577 +0000 UTC m=+22.398714966" observedRunningTime="2026-04-22 18:36:24.44796547 +0000 UTC m=+22.769733867" watchObservedRunningTime="2026-04-22 18:36:24.448059382 +0000 UTC m=+22.769827782" Apr 22 18:36:25.250023 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:25.249989 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:25.250221 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:25.249993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:25.250221 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:25.250130 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:25.250221 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:25.250188 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:25.686250 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:25.686218 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:25.687065 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:25.687044 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:26.250679 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:26.250647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:26.250828 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:26.250781 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:26.438944 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:26.438786 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:36:26.439341 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:26.439318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"eb3f128a750221f7261cec75f2dfc293ea59c148cf0823e5a04d62b6f87c2a09"} Apr 22 18:36:26.439913 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:26.439892 2571 scope.go:117] "RemoveContainer" containerID="52168a409552182ca4b1f13a3568e91f2ce2aa450b9a6c25306bfa057305f568" Apr 22 18:36:27.075627 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.075601 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:27.076202 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.076184 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hcck9" Apr 22 18:36:27.250413 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.250345 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:27.250536 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.250347 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:27.250536 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:27.250439 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:27.250610 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:27.250536 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:27.442875 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.442841 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="7b6fd49923a4ee312bdc248edd574fa63ad2aed1cebfffff9cef84ef505a13bd" exitCode=0 Apr 22 18:36:27.442997 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.442919 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"7b6fd49923a4ee312bdc248edd574fa63ad2aed1cebfffff9cef84ef505a13bd"} Apr 22 18:36:27.446280 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.446265 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:36:27.446664 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.446641 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" event={"ID":"4d26e750-5c11-4023-9012-7dd824eeda4f","Type":"ContainerStarted","Data":"92135935fe61694be16d07c7c94623cc0bdd64f625de7aaeafdd0ac44b40439e"} Apr 22 18:36:27.446792 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.446775 2571 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 18:36:27.447017 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.447001 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:27.447128 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.447028 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:27.461061 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.461044 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:27.461184 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.461154 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:27.493961 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.493925 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" podStartSLOduration=7.352766367 podStartE2EDuration="25.493913991s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.462845218 +0000 UTC m=+1.784613597" lastFinishedPulling="2026-04-22 18:36:21.603992838 +0000 UTC m=+19.925761221" observedRunningTime="2026-04-22 18:36:27.492587985 +0000 UTC m=+25.814356384" watchObservedRunningTime="2026-04-22 18:36:27.493913991 +0000 UTC m=+25.815682389" Apr 22 18:36:27.573503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:27.573472 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:36:28.250503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.250292 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:28.250852 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:28.250632 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:28.288893 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.288371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lllsm"] Apr 22 18:36:28.288893 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.288509 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:28.288893 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:28.288615 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:28.289182 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.288963 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-n59sp"] Apr 22 18:36:28.302252 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.302230 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjdzq"] Apr 22 18:36:28.302365 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.302352 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:28.302459 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:28.302439 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:28.450470 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.450441 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="7bcf2e86e7c200a0ff3eaf0a06512950c80509f266b9fc7530889c502bc9a763" exitCode=0 Apr 22 18:36:28.450613 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.450509 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:28.450613 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:28.450545 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"7bcf2e86e7c200a0ff3eaf0a06512950c80509f266b9fc7530889c502bc9a763"} Apr 22 18:36:28.450858 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:28.450825 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:29.454551 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:29.454478 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="f923c2b33e7d8ea4e8a27fd7c72e8a4b43e825171a9848bb9cbeac9f08115afc" exitCode=0 Apr 22 18:36:29.455170 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:29.454559 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"f923c2b33e7d8ea4e8a27fd7c72e8a4b43e825171a9848bb9cbeac9f08115afc"} Apr 22 18:36:30.250287 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:30.250252 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:30.250449 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:30.250301 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:30.250449 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:30.250397 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:30.250449 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:30.250420 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:30.250559 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:30.250480 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:30.250559 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:30.250549 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:32.251949 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:32.251919 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:32.252642 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:32.252007 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:32.252642 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:32.252051 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:32.252642 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:32.252104 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:32.252642 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:32.252127 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:32.252642 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:32.252219 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:34.250791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.250752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:34.251237 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.250752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:34.251237 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.250752 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:34.251237 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.251005 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-lllsm" podUID="42281d48-d81c-48a1-ac06-753e55e4ae05" Apr 22 18:36:34.251237 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.251056 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-n59sp" podUID="4e7619c0-c870-4eac-9372-f539d9128cc8" Apr 22 18:36:34.251237 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.250884 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:36:34.490399 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.490371 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-85.ec2.internal" event="NodeReady" Apr 22 18:36:34.490585 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.490536 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 18:36:34.539376 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.539338 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jcxbc"] Apr 22 18:36:34.577083 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.577049 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8lhzb"] Apr 22 18:36:34.577245 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.577227 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.579793 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.579767 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 18:36:34.579901 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.579784 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 18:36:34.579901 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.579796 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\"" Apr 22 18:36:34.580008 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.579912 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 18:36:34.597377 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.597305 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jcxbc"] Apr 22 18:36:34.597487 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.597392 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8lhzb"] Apr 22 18:36:34.597487 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.597419 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.599827 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.599807 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 18:36:34.599923 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.599850 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\"" Apr 22 18:36:34.599923 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.599858 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 18:36:34.664460 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664430 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-config-volume\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.664600 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664496 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-tmp-dir\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.664600 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l44bm\" (UniqueName: \"kubernetes.io/projected/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-kube-api-access-l44bm\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.664600 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.664764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92st9\" (UniqueName: \"kubernetes.io/projected/8a27a543-b3c5-436c-8326-abb0c703e4d0-kube-api-access-92st9\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.664764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.664646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.765782 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.765746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-tmp-dir\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.765960 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:36:34.765793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l44bm\" (UniqueName: \"kubernetes.io/projected/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-kube-api-access-l44bm\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.765960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.765817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.765960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.765843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92st9\" (UniqueName: \"kubernetes.io/projected/8a27a543-b3c5-436c-8326-abb0c703e4d0-kube-api-access-92st9\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.765960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.765875 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.765960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.765943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-config-volume\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.766220 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.766120 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-tmp-dir\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.766277 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.766246 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:34.766329 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.766311 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:35.266290935 +0000 UTC m=+33.588059313 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:34.766474 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.766454 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:34.766556 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.766519 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:35.266501957 +0000 UTC m=+33.588270342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:34.766621 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.766585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-config-volume\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.776698 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.776573 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92st9\" (UniqueName: \"kubernetes.io/projected/8a27a543-b3c5-436c-8326-abb0c703e4d0-kube-api-access-92st9\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:34.776795 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.776571 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l44bm\" (UniqueName: \"kubernetes.io/projected/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-kube-api-access-l44bm\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:34.866820 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.866707 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:34.866985 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.866849 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:34.866985 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.866916 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:06.866897797 +0000 UTC m=+65.188666173 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:36:34.967918 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:34.967879 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:34.968124 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.968043 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:36:34.968124 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.968071 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:36:34.968124 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.968103 2571 projected.go:194] Error preparing data for projected volume kube-api-access-mfk94 for pod openshift-network-diagnostics/network-check-target-lllsm: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:34.968298 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:34.968184 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94 podName:42281d48-d81c-48a1-ac06-753e55e4ae05 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:06.96816128 +0000 UTC m=+65.289929657 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mfk94" (UniqueName: "kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94") pod "network-check-target-lllsm" (UID: "42281d48-d81c-48a1-ac06-753e55e4ae05") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:36:35.270303 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:35.270273 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:35.270646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:35.270311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:35.270646 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:35.270463 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:35.270646 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:35.270529 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:36.270509693 +0000 UTC m=+34.592278088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:35.270646 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:35.270549 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:35.270646 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:35.270586 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:36.270574478 +0000 UTC m=+34.592342861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:36.250708 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.250670 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:36:36.250921 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.250675 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:36.250921 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.250685 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:36:36.254018 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.253998 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 18:36:36.255015 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.254996 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:36:36.255149 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.255060 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:36:36.255212 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.255155 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vgxpp\"" Apr 22 18:36:36.255212 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.255188 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:36:36.255315 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.255221 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\"" Apr 22 18:36:36.278909 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.278888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:36.279210 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.278926 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:36.279210 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:36.279030 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:36.279210 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:36.279031 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:36.279210 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:36.279080 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:38.279066371 +0000 UTC m=+36.600834747 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:36.279210 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:36.279136 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. 
No retries permitted until 2026-04-22 18:36:38.279108504 +0000 UTC m=+36.600876897 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:36.470164 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.470128 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="9644dff56167d9d224194524b51b57e730efaaae0b530068f7f418eb6d3e4205" exitCode=0 Apr 22 18:36:36.470336 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:36.470179 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"9644dff56167d9d224194524b51b57e730efaaae0b530068f7f418eb6d3e4205"} Apr 22 18:36:37.474413 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.474379 2571 generic.go:358] "Generic (PLEG): container finished" podID="a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79" containerID="4176729f2d0ecdebdeddf7554b6bb9826c33fafc3f8d808fdcc3b37efb3cf31a" exitCode=0 Apr 22 18:36:37.474758 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.474440 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerDied","Data":"4176729f2d0ecdebdeddf7554b6bb9826c33fafc3f8d808fdcc3b37efb3cf31a"} Apr 22 18:36:37.689675 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.689644 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:37.692173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.692143 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4e7619c0-c870-4eac-9372-f539d9128cc8-original-pull-secret\") pod \"global-pull-secret-syncer-n59sp\" (UID: \"4e7619c0-c870-4eac-9372-f539d9128cc8\") " pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:37.767303 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.767283 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-n59sp" Apr 22 18:36:37.903335 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:37.903296 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-n59sp"] Apr 22 18:36:37.908324 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:36:37.908299 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e7619c0_c870_4eac_9372_f539d9128cc8.slice/crio-b0d806cfa7eac78aa5e8f46558a3d4a7fdb22deb8c430930a3daa559999139c0 WatchSource:0}: Error finding container b0d806cfa7eac78aa5e8f46558a3d4a7fdb22deb8c430930a3daa559999139c0: Status 404 returned error can't find the container with id b0d806cfa7eac78aa5e8f46558a3d4a7fdb22deb8c430930a3daa559999139c0 Apr 22 18:36:38.293361 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:38.293331 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:38.293537 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:38.293368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:38.293537 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:38.293486 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:38.293537 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:38.293506 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:38.293695 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:38.293551 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.293532064 +0000 UTC m=+40.615300444 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:38.293695 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:38.293569 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:42.293560002 +0000 UTC m=+40.615328384 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:38.477741 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:38.477698 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-n59sp" event={"ID":"4e7619c0-c870-4eac-9372-f539d9128cc8","Type":"ContainerStarted","Data":"b0d806cfa7eac78aa5e8f46558a3d4a7fdb22deb8c430930a3daa559999139c0"} Apr 22 18:36:38.481008 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:38.480978 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" event={"ID":"a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79","Type":"ContainerStarted","Data":"b15199e83adcd9ba44883f9d801e2de8f79e2959b19d0c167aecab88da2b5dd5"} Apr 22 18:36:38.507944 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:38.507890 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4dcm7" podStartSLOduration=4.6575264910000005 podStartE2EDuration="36.507874171s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:36:03.53315487 +0000 UTC m=+1.854923246" lastFinishedPulling="2026-04-22 18:36:35.383502547 +0000 UTC m=+33.705270926" observedRunningTime="2026-04-22 18:36:38.506536396 +0000 UTC m=+36.828304791" watchObservedRunningTime="2026-04-22 18:36:38.507874171 +0000 UTC m=+36.829642570" Apr 22 18:36:42.325254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:42.325223 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:42.325625 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:42.325265 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:42.325625 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:42.325352 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:42.325625 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:42.325377 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:42.325625 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:42.325409 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:50.325396452 +0000 UTC m=+48.647164828 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:42.325625 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:42.325449 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:36:50.325441182 +0000 UTC m=+48.647209558 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:42.489980 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:42.489938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-n59sp" event={"ID":"4e7619c0-c870-4eac-9372-f539d9128cc8","Type":"ContainerStarted","Data":"a6467604622d81f5667d26c24108f94378a496498b1b11a872d6c8ca3baf3bc2"} Apr 22 18:36:42.506224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:42.506170 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-n59sp" podStartSLOduration=33.610504918 podStartE2EDuration="37.506155661s" podCreationTimestamp="2026-04-22 18:36:05 +0000 UTC" firstStartedPulling="2026-04-22 18:36:37.909751327 +0000 UTC m=+36.231519703" lastFinishedPulling="2026-04-22 18:36:41.805402067 +0000 UTC m=+40.127170446" observedRunningTime="2026-04-22 18:36:42.505850299 +0000 UTC m=+40.827618722" watchObservedRunningTime="2026-04-22 18:36:42.506155661 +0000 UTC m=+40.827924061" Apr 22 18:36:50.381904 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:50.381862 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:36:50.382287 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:50.381917 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:36:50.382287 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:50.382034 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:36:50.382287 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:50.382034 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:36:50.382287 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:50.382128 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:06.382106848 +0000 UTC m=+64.703875239 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:36:50.382287 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:36:50.382155 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:06.382143018 +0000 UTC m=+64.703911401 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:36:59.471698 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:36:59.471663 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rx4st" Apr 22 18:37:06.389714 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:06.389672 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:37:06.390136 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:06.389746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:37:06.390136 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.389811 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:37:06.390136 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.389879 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:38.389863465 +0000 UTC m=+96.711631841 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:37:06.390136 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.389820 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:37:06.390136 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.389945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:37:38.389932348 +0000 UTC m=+96.711700723 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:37:06.893602 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:06.893565 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:37:06.896521 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:06.896499 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 18:37:06.904157 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.904137 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:37:06.904219 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:06.904191 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:10.904176506 +0000 UTC m=+129.225944881 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : secret "metrics-daemon-secret" not found Apr 22 18:37:06.993839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:06.993796 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:37:07.007058 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.007033 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 18:37:07.028132 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.028109 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 18:37:07.037195 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.037175 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfk94\" (UniqueName: \"kubernetes.io/projected/42281d48-d81c-48a1-ac06-753e55e4ae05-kube-api-access-mfk94\") pod \"network-check-target-lllsm\" (UID: \"42281d48-d81c-48a1-ac06-753e55e4ae05\") " pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:37:07.174887 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.174814 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vgxpp\"" Apr 22 18:37:07.180862 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.180844 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:37:07.307881 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.307848 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-lllsm"] Apr 22 18:37:07.311558 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:37:07.311521 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42281d48_d81c_48a1_ac06_753e55e4ae05.slice/crio-495178ce7090cb7fe921b6de716be8cde45e8b217a9891726b8a2db006b15da3 WatchSource:0}: Error finding container 495178ce7090cb7fe921b6de716be8cde45e8b217a9891726b8a2db006b15da3: Status 404 returned error can't find the container with id 495178ce7090cb7fe921b6de716be8cde45e8b217a9891726b8a2db006b15da3 Apr 22 18:37:07.538766 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:07.538730 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lllsm" event={"ID":"42281d48-d81c-48a1-ac06-753e55e4ae05","Type":"ContainerStarted","Data":"495178ce7090cb7fe921b6de716be8cde45e8b217a9891726b8a2db006b15da3"} Apr 22 18:37:10.546190 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:10.546156 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-lllsm" event={"ID":"42281d48-d81c-48a1-ac06-753e55e4ae05","Type":"ContainerStarted","Data":"62ba0b78d2e86eb8dc684d98a741bec8fd370773a7a2636d473e09c979a5907f"} Apr 22 18:37:10.546640 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:10.546291 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:37:10.562796 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:10.562754 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-lllsm" podStartSLOduration=65.918062853 podStartE2EDuration="1m8.562742154s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:37:07.313889055 +0000 UTC m=+65.635657431" lastFinishedPulling="2026-04-22 18:37:09.958568341 +0000 UTC m=+68.280336732" observedRunningTime="2026-04-22 18:37:10.562650849 +0000 UTC m=+68.884419248" watchObservedRunningTime="2026-04-22 18:37:10.562742154 +0000 UTC m=+68.884510543" Apr 22 18:37:38.411689 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:38.411639 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:37:38.411689 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:38.411693 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:37:38.412183 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:38.411785 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:37:38.412183 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:38.411789 2571 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:37:38.412183 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:38.411841 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls podName:3dbfdfa2-adbc-427e-8859-26bcaa36a0a7 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:42.411826369 +0000 UTC m=+160.733594745 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls") pod "dns-default-8lhzb" (UID: "3dbfdfa2-adbc-427e-8859-26bcaa36a0a7") : secret "dns-default-metrics-tls" not found Apr 22 18:37:38.412183 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:37:38.411858 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert podName:8a27a543-b3c5-436c-8326-abb0c703e4d0 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:42.411851549 +0000 UTC m=+160.733619925 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert") pod "ingress-canary-jcxbc" (UID: "8a27a543-b3c5-436c-8326-abb0c703e4d0") : secret "canary-serving-cert" not found Apr 22 18:37:41.550357 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:37:41.550328 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-lllsm" Apr 22 18:38:04.455153 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.455117 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:38:04.457898 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.457882 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.460356 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.460332 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 18:38:04.460486 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.460421 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 18:38:04.460609 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.460495 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vvpgg\"" Apr 22 18:38:04.460753 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.460737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 18:38:04.466588 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.466560 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 18:38:04.470578 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.470543 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:38:04.558734 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.558701 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm"] Apr 22 18:38:04.561552 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.561534 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb"] Apr 22 18:38:04.561717 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.561696 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.563982 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.563966 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.564491 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.564473 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 18:38:04.565160 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.565146 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.565210 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.565188 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-mdc7s\"" Apr 22 18:38:04.565427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.565410 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.566439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.566424 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 18:38:04.566644 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.566631 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-pnght\"" Apr 22 18:38:04.566719 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.566635 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.568014 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.567995 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.568341 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.568324 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 18:38:04.575044 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.575024 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm"] Apr 22 18:38:04.581392 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581366 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581512 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5srjt\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581512 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates\") pod 
\"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581578 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581527 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581611 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581586 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581653 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581637 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581697 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581662 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.581754 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.581734 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.586550 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.586521 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb"] Apr 22 18:38:04.655178 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.655146 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw"] Apr 22 18:38:04.657960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.657944 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.660533 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.660512 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-5kpdd\"" Apr 22 18:38:04.660648 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.660512 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:04.660698 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.660676 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 18:38:04.661223 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.661210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 18:38:04.661272 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.661217 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 18:38:04.667367 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.667344 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw"] Apr 22 18:38:04.682105 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682071 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682222 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682118 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682222 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682165 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.682325 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682299 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682384 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod 
\"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682384 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5srjt\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682408 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae20567a-fdd6-4700-8205-d7122697fdbb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.682478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682583 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682583 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682506 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.682583 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682514 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq9kk\" (UniqueName: \"kubernetes.io/projected/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-kube-api-access-qq9kk\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.682583 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.682515 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:04.682583 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.682543 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59f678db-bsx5q: secret "image-registry-tls" not found Apr 22 18:38:04.682807 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.682597 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls 
podName:9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a nodeName:}" failed. No retries permitted until 2026-04-22 18:38:05.182582183 +0000 UTC m=+123.504350563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls") pod "image-registry-6d59f678db-bsx5q" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a") : secret "image-registry-tls" not found Apr 22 18:38:04.682807 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.682807 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682654 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw4jm\" (UniqueName: \"kubernetes.io/projected/ae20567a-fdd6-4700-8205-d7122697fdbb-kube-api-access-pw4jm\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.682807 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.682712 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.683041 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.683017 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.683576 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.683553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.684649 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.684624 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.684729 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.684654 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " 
pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.696436 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.696413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.696436 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.696424 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5srjt\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:04.783851 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.783820 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae20567a-fdd6-4700-8205-d7122697fdbb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.783996 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.783858 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe419e0-6278-4785-8045-d733d349a280-config\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.783996 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.783886 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq9kk\" (UniqueName: \"kubernetes.io/projected/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-kube-api-access-qq9kk\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.783996 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.783979 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.784177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.784076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw4jm\" (UniqueName: \"kubernetes.io/projected/ae20567a-fdd6-4700-8205-d7122697fdbb-kube-api-access-pw4jm\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.784177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.784135 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfe419e0-6278-4785-8045-d733d349a280-serving-cert\") pod 
\"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.784177 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.784142 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:04.784320 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.784179 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.784320 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.784214 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:05.284193264 +0000 UTC m=+123.605961658 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:04.784320 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.784243 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:04.784320 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.784242 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhnhw\" (UniqueName: \"kubernetes.io/projected/dfe419e0-6278-4785-8045-d733d349a280-kube-api-access-lhnhw\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.784320 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:04.784288 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls podName:5b14e597-f2b9-4b24-b6fe-cd94c52580fa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:05.284275795 +0000 UTC m=+123.606044171 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gddm" (UID: "5b14e597-f2b9-4b24-b6fe-cd94c52580fa") : secret "samples-operator-tls" not found Apr 22 18:38:04.784609 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.784589 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae20567a-fdd6-4700-8205-d7122697fdbb-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.792836 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.792805 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq9kk\" (UniqueName: \"kubernetes.io/projected/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-kube-api-access-qq9kk\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:04.793157 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.793138 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw4jm\" (UniqueName: \"kubernetes.io/projected/ae20567a-fdd6-4700-8205-d7122697fdbb-kube-api-access-pw4jm\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:04.885449 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.885407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhnhw\" (UniqueName: \"kubernetes.io/projected/dfe419e0-6278-4785-8045-d733d349a280-kube-api-access-lhnhw\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.885631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.885457 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe419e0-6278-4785-8045-d733d349a280-config\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.885631 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.885529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfe419e0-6278-4785-8045-d733d349a280-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.885998 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.885976 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfe419e0-6278-4785-8045-d733d349a280-config\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.887737 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.887716 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/dfe419e0-6278-4785-8045-d733d349a280-serving-cert\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.894175 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.894158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhnhw\" (UniqueName: \"kubernetes.io/projected/dfe419e0-6278-4785-8045-d733d349a280-kube-api-access-lhnhw\") pod \"service-ca-operator-d6fc45fc5-c8fsw\" (UID: \"dfe419e0-6278-4785-8045-d733d349a280\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:04.967341 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:04.967300 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" Apr 22 18:38:05.082713 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:05.082677 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw"] Apr 22 18:38:05.085802 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:05.085776 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfe419e0_6278_4785_8045_d733d349a280.slice/crio-e7610ce6420a757821784ce3d1d7f062011621bb63c573838712e2e6c46b3f0c WatchSource:0}: Error finding container e7610ce6420a757821784ce3d1d7f062011621bb63c573838712e2e6c46b3f0c: Status 404 returned error can't find the container with id e7610ce6420a757821784ce3d1d7f062011621bb63c573838712e2e6c46b3f0c Apr 22 18:38:05.187573 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:05.187539 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:05.187756 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.187701 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:05.187756 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.187722 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59f678db-bsx5q: secret "image-registry-tls" not found Apr 22 18:38:05.187846 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.187774 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls podName:9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a nodeName:}" failed. No retries permitted until 2026-04-22 18:38:06.187758792 +0000 UTC m=+124.509527168 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls") pod "image-registry-6d59f678db-bsx5q" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a") : secret "image-registry-tls" not found Apr 22 18:38:05.288671 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:05.288589 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:05.288809 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:05.288679 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:05.288809 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.288771 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:05.288809 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.288774 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:05.288934 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.288829 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls podName:5b14e597-f2b9-4b24-b6fe-cd94c52580fa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:06.28881605 +0000 UTC m=+124.610584425 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gddm" (UID: "5b14e597-f2b9-4b24-b6fe-cd94c52580fa") : secret "samples-operator-tls" not found Apr 22 18:38:05.288934 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:05.288845 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:06.288836253 +0000 UTC m=+124.610604629 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:05.656305 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:05.656217 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" event={"ID":"dfe419e0-6278-4785-8045-d733d349a280","Type":"ContainerStarted","Data":"e7610ce6420a757821784ce3d1d7f062011621bb63c573838712e2e6c46b3f0c"} Apr 22 18:38:06.195380 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:06.195322 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:06.195567 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.195474 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:06.195567 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.195495 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59f678db-bsx5q: secret "image-registry-tls" not found Apr 22 18:38:06.195567 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.195550 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls podName:9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.195532624 +0000 UTC m=+126.517300999 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls") pod "image-registry-6d59f678db-bsx5q" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a") : secret "image-registry-tls" not found Apr 22 18:38:06.296563 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:06.296529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:06.296724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:06.296628 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:06.296724 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.296687 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:06.296870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.296741 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.296728619 +0000 UTC m=+126.618497000 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:06.296870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.296795 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:06.296870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:06.296849 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls podName:5b14e597-f2b9-4b24-b6fe-cd94c52580fa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:08.296836328 +0000 UTC m=+126.618604704 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gddm" (UID: "5b14e597-f2b9-4b24-b6fe-cd94c52580fa") : secret "samples-operator-tls" not found Apr 22 18:38:07.661604 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:07.661564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" event={"ID":"dfe419e0-6278-4785-8045-d733d349a280","Type":"ContainerStarted","Data":"7434a7dedc2d4c0cde5cecd571c4b2aa7d625b014cce64faa6adbd8d63ca5539"} Apr 22 18:38:08.212545 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:08.212510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:08.212728 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.212629 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:08.212728 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.212641 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59f678db-bsx5q: secret "image-registry-tls" not found Apr 22 18:38:08.212728 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.212693 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls podName:9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a nodeName:}" failed. No retries permitted until 2026-04-22 18:38:12.21267918 +0000 UTC m=+130.534447556 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls") pod "image-registry-6d59f678db-bsx5q" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a") : secret "image-registry-tls" not found Apr 22 18:38:08.313270 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:08.313241 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:08.313427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:08.313308 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:08.313427 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.313386 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:08.313500 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.313429 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:08.313500 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.313457 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls podName:5b14e597-f2b9-4b24-b6fe-cd94c52580fa nodeName:}" failed. No retries permitted until 2026-04-22 18:38:12.313440512 +0000 UTC m=+130.635208888 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gddm" (UID: "5b14e597-f2b9-4b24-b6fe-cd94c52580fa") : secret "samples-operator-tls" not found Apr 22 18:38:08.313500 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:08.313474 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:12.313462562 +0000 UTC m=+130.635230938 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:09.112655 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.112600 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" podStartSLOduration=3.180919334 podStartE2EDuration="5.112583867s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:05.087581601 +0000 UTC m=+123.409349977" lastFinishedPulling="2026-04-22 18:38:07.019246132 +0000 UTC m=+125.341014510" observedRunningTime="2026-04-22 18:38:07.682103154 +0000 UTC m=+126.003871542" watchObservedRunningTime="2026-04-22 18:38:09.112583867 +0000 UTC m=+127.434352264" Apr 22 18:38:09.113044 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.112818 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-452lr"] Apr 22 18:38:09.115741 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.115724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" Apr 22 18:38:09.119246 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.119222 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-2t55b\"" Apr 22 18:38:09.130398 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.130370 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-452lr"] Apr 22 18:38:09.206871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.206839 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk"] Apr 22 18:38:09.209851 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.209835 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" Apr 22 18:38:09.212800 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.212776 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 22 18:38:09.212934 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.212821 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 22 18:38:09.212934 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.212823 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-xr5pz\"" Apr 22 18:38:09.220218 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.220197 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk"] Apr 22 18:38:09.221411 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.221393 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jf5sr\" (UniqueName: \"kubernetes.io/projected/db99ee19-11b0-4246-bee1-d19d9ac8abc1-kube-api-access-jf5sr\") pod \"network-check-source-8894fc9bd-452lr\" (UID: \"db99ee19-11b0-4246-bee1-d19d9ac8abc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" Apr 22 18:38:09.322357 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.322323 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2xf\" (UniqueName: \"kubernetes.io/projected/c415df20-fd9e-4f8b-9365-cf896cd78be4-kube-api-access-vh2xf\") pod \"migrator-74bb7799d9-88ztk\" (UID: \"c415df20-fd9e-4f8b-9365-cf896cd78be4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" Apr 22 18:38:09.322524 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.322401 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jf5sr\" (UniqueName: \"kubernetes.io/projected/db99ee19-11b0-4246-bee1-d19d9ac8abc1-kube-api-access-jf5sr\") pod \"network-check-source-8894fc9bd-452lr\" (UID: \"db99ee19-11b0-4246-bee1-d19d9ac8abc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" Apr 22 18:38:09.333708 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.333681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jf5sr\" (UniqueName: \"kubernetes.io/projected/db99ee19-11b0-4246-bee1-d19d9ac8abc1-kube-api-access-jf5sr\") pod \"network-check-source-8894fc9bd-452lr\" (UID: \"db99ee19-11b0-4246-bee1-d19d9ac8abc1\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" Apr 22 18:38:09.423364 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.423272 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2xf\" (UniqueName: \"kubernetes.io/projected/c415df20-fd9e-4f8b-9365-cf896cd78be4-kube-api-access-vh2xf\") pod \"migrator-74bb7799d9-88ztk\" (UID: \"c415df20-fd9e-4f8b-9365-cf896cd78be4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" Apr 22 18:38:09.423547 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.423528 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" Apr 22 18:38:09.433412 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.433388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2xf\" (UniqueName: \"kubernetes.io/projected/c415df20-fd9e-4f8b-9365-cf896cd78be4-kube-api-access-vh2xf\") pod \"migrator-74bb7799d9-88ztk\" (UID: \"c415df20-fd9e-4f8b-9365-cf896cd78be4\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" Apr 22 18:38:09.518204 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.518104 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" Apr 22 18:38:09.540104 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.540063 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-452lr"] Apr 22 18:38:09.543004 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:09.542976 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb99ee19_11b0_4246_bee1_d19d9ac8abc1.slice/crio-b908881420a6ea3c8edd8dbfbf02e13cfa437522a630837c81afdac9daef4806 WatchSource:0}: Error finding container b908881420a6ea3c8edd8dbfbf02e13cfa437522a630837c81afdac9daef4806: Status 404 returned error can't find the container with id b908881420a6ea3c8edd8dbfbf02e13cfa437522a630837c81afdac9daef4806 Apr 22 18:38:09.644562 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.644529 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk"] Apr 22 18:38:09.648100 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:09.648054 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc415df20_fd9e_4f8b_9365_cf896cd78be4.slice/crio-9dfb33494647f7c2cd935326e912787f1d597f581f28f42c58f4be52e9255c63 WatchSource:0}: Error finding container 9dfb33494647f7c2cd935326e912787f1d597f581f28f42c58f4be52e9255c63: Status 404 returned error can't find the container with id 9dfb33494647f7c2cd935326e912787f1d597f581f28f42c58f4be52e9255c63 Apr 22 18:38:09.667240 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.667206 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" event={"ID":"c415df20-fd9e-4f8b-9365-cf896cd78be4","Type":"ContainerStarted","Data":"9dfb33494647f7c2cd935326e912787f1d597f581f28f42c58f4be52e9255c63"} Apr 22 18:38:09.668522 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.668494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" event={"ID":"db99ee19-11b0-4246-bee1-d19d9ac8abc1","Type":"ContainerStarted","Data":"d9f743c5b0252a2059a074b5f3231cc7269e0ed298329ddb24428306ea03426f"} Apr 22 18:38:09.668629 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.668527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" event={"ID":"db99ee19-11b0-4246-bee1-d19d9ac8abc1","Type":"ContainerStarted","Data":"b908881420a6ea3c8edd8dbfbf02e13cfa437522a630837c81afdac9daef4806"} Apr 22 18:38:09.687581 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:09.687493 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-diagnostics/network-check-source-8894fc9bd-452lr" podStartSLOduration=0.687476603 podStartE2EDuration="687.476603ms" podCreationTimestamp="2026-04-22 18:38:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:09.687354794 +0000 UTC m=+128.009123189" watchObservedRunningTime="2026-04-22 18:38:09.687476603 +0000 UTC m=+128.009245002" Apr 22 18:38:10.936136 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:10.936080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:38:10.936485 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:10.936240 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:38:10.936485 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:10.936308 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs podName:f01e5c64-eadd-49f5-a2f4-4953111daa69 nodeName:}" failed. No retries permitted until 2026-04-22 18:40:12.936291314 +0000 UTC m=+251.258059689 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs") pod "network-metrics-daemon-vjdzq" (UID: "f01e5c64-eadd-49f5-a2f4-4953111daa69") : secret "metrics-daemon-secret" not found Apr 22 18:38:11.674771 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:11.674731 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" event={"ID":"c415df20-fd9e-4f8b-9365-cf896cd78be4","Type":"ContainerStarted","Data":"1c62bfc031a7ca334bf208be6998ba14efcd68b9b6b16ce9b82681bcafe1a560"} Apr 22 18:38:11.674771 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:11.674769 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" event={"ID":"c415df20-fd9e-4f8b-9365-cf896cd78be4","Type":"ContainerStarted","Data":"2745a17d8d2b4e6d22c2fe5f3f35d849f37503205d92c39c6dd438b4c6214249"} Apr 22 18:38:11.693911 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:11.693867 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-88ztk" podStartSLOduration=1.440665128 podStartE2EDuration="2.693853346s" podCreationTimestamp="2026-04-22 18:38:09 +0000 UTC" firstStartedPulling="2026-04-22 18:38:09.650478104 +0000 UTC m=+127.972246497" lastFinishedPulling="2026-04-22 18:38:10.903666326 +0000 UTC m=+129.225434715" observedRunningTime="2026-04-22 18:38:11.692272005 +0000 UTC m=+130.014040414" watchObservedRunningTime="2026-04-22 18:38:11.693853346 +0000 UTC m=+130.015621744" Apr 22 18:38:12.248465 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:12.248423 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 
18:38:12.248870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.248565 2571 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:38:12.248870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.248583 2571 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d59f678db-bsx5q: secret "image-registry-tls" not found Apr 22 18:38:12.248870 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.248633 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls podName:9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.248618966 +0000 UTC m=+138.570387346 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls") pod "image-registry-6d59f678db-bsx5q" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a") : secret "image-registry-tls" not found Apr 22 18:38:12.349834 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:12.349797 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:12.350015 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:12.349888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:12.350015 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.349963 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:12.350167 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.350042 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:20.350023479 +0000 UTC m=+138.671791874 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:12.350167 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.349968 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 18:38:12.350167 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:12.350115 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls podName:5b14e597-f2b9-4b24-b6fe-cd94c52580fa nodeName:}" failed. 
No retries permitted until 2026-04-22 18:38:20.350103788 +0000 UTC m=+138.671872164 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-7gddm" (UID: "5b14e597-f2b9-4b24-b6fe-cd94c52580fa") : secret "samples-operator-tls" not found Apr 22 18:38:12.569779 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:12.569748 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gxtfk_f75b87a2-8899-4b74-9e48-0ca63be22b47/dns-node-resolver/0.log" Apr 22 18:38:13.568361 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:13.568333 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-djp9s_4717a966-da61-4170-b33d-9c683e74d3aa/node-ca/0.log" Apr 22 18:38:14.769227 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:14.769198 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-88ztk_c415df20-fd9e-4f8b-9365-cf896cd78be4/migrator/0.log" Apr 22 18:38:14.969050 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:14.969023 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-88ztk_c415df20-fd9e-4f8b-9365-cf896cd78be4/graceful-termination/0.log" Apr 22 18:38:20.311589 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.311554 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:20.313957 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.313924 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"image-registry-6d59f678db-bsx5q\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:20.369531 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.369507 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:20.412768 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.412462 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:20.412768 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:20.412568 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:20.412768 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.412578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:20.412768 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:20.412634 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls podName:ae20567a-fdd6-4700-8205-d7122697fdbb nodeName:}" failed. No retries permitted until 2026-04-22 18:38:36.412612843 +0000 UTC m=+154.734381236 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kzgfb" (UID: "ae20567a-fdd6-4700-8205-d7122697fdbb") : secret "cluster-monitoring-operator-tls" not found Apr 22 18:38:20.415795 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.415556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/5b14e597-f2b9-4b24-b6fe-cd94c52580fa-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-7gddm\" (UID: \"5b14e597-f2b9-4b24-b6fe-cd94c52580fa\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:20.472564 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.472536 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" Apr 22 18:38:20.490082 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.490052 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:38:20.493292 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:20.493251 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b546fbc_25fd_42a5_9c4e_c1f6344e9f6a.slice/crio-cd7b6d0110aef2d662602e9471953e7cb2b97d6cd2eccf13c70f8519ef0daacb WatchSource:0}: Error finding container cd7b6d0110aef2d662602e9471953e7cb2b97d6cd2eccf13c70f8519ef0daacb: Status 404 returned error can't find the container with id cd7b6d0110aef2d662602e9471953e7cb2b97d6cd2eccf13c70f8519ef0daacb Apr 22 18:38:20.592011 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.591931 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm"] Apr 22 18:38:20.699143 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.699077 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" event={"ID":"5b14e597-f2b9-4b24-b6fe-cd94c52580fa","Type":"ContainerStarted","Data":"36b7965a5484c524f7d9fbe3ede820381eebee823719cea5bbed6e33536e56c5"} Apr 22 18:38:20.700473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.700434 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" event={"ID":"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a","Type":"ContainerStarted","Data":"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92"} Apr 22 18:38:20.700473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.700467 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" event={"ID":"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a","Type":"ContainerStarted","Data":"cd7b6d0110aef2d662602e9471953e7cb2b97d6cd2eccf13c70f8519ef0daacb"} Apr 22 18:38:20.700653 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:20.700560 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:38:22.270325 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:22.270277 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" podStartSLOduration=18.27026205 podStartE2EDuration="18.27026205s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:38:20.750464602 +0000 UTC m=+139.072232999" watchObservedRunningTime="2026-04-22 18:38:22.27026205 +0000 UTC m=+140.592030448" Apr 22 18:38:22.706923 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:22.706894 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" event={"ID":"5b14e597-f2b9-4b24-b6fe-cd94c52580fa","Type":"ContainerStarted","Data":"69373b3398bb4c0106564516a715f7829d844a01dec3cf7130100398d8231cfc"} Apr 22 18:38:23.714830 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:23.714794 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" event={"ID":"5b14e597-f2b9-4b24-b6fe-cd94c52580fa","Type":"ContainerStarted","Data":"bcb1cc414e27c81c7d2fbdd6e8c0b02460933e45ac150587bc206198afc7fd0d"} Apr 22 18:38:23.732870 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:23.732825 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-7gddm" podStartSLOduration=17.737187601 podStartE2EDuration="19.732813045s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:20.642428198 +0000 UTC m=+138.964196574" lastFinishedPulling="2026-04-22 18:38:22.638053641 +0000 UTC m=+140.959822018" observedRunningTime="2026-04-22 18:38:23.732223993 +0000 UTC m=+142.053992402" watchObservedRunningTime="2026-04-22 18:38:23.732813045 +0000 UTC m=+142.054581443" Apr 22 18:38:35.387077 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.387045 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j9zwr"] Apr 22 18:38:35.392440 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.392419 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.395888 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.395865 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 18:38:35.395991 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.395865 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-b54tz\"" Apr 22 18:38:35.395991 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.395898 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 18:38:35.395991 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.395937 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 18:38:35.395991 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.395865 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 18:38:35.401035 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.401013 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j9zwr"] Apr 22 18:38:35.436972 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.436945 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:38:35.441063 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.441034 2571 patch_prober.go:28] interesting pod/image-registry-6d59f678db-bsx5q container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:38:35.441174 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.441109 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 22 18:38:35.523199 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.523163 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.523367 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.523216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e21f1d0-e415-4dcb-b33c-327299da2218-crio-socket\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.523367 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.523309 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e21f1d0-e415-4dcb-b33c-327299da2218-data-volume\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.523367 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.523359 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e21f1d0-e415-4dcb-b33c-327299da2218-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.523506 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.523478 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgm5l\" (UniqueName: \"kubernetes.io/projected/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-api-access-cgm5l\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.624784 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624751 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cgm5l\" (UniqueName: \"kubernetes.io/projected/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-api-access-cgm5l\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.624962 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.624962 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624832 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e21f1d0-e415-4dcb-b33c-327299da2218-crio-socket\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " 
pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.624962 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624864 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e21f1d0-e415-4dcb-b33c-327299da2218-data-volume\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.624962 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624888 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e21f1d0-e415-4dcb-b33c-327299da2218-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.625191 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.624975 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/8e21f1d0-e415-4dcb-b33c-327299da2218-crio-socket\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.625261 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.625245 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/8e21f1d0-e415-4dcb-b33c-327299da2218-data-volume\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.625391 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.625358 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.627259 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.627228 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/8e21f1d0-e415-4dcb-b33c-327299da2218-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.635244 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.635221 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgm5l\" (UniqueName: \"kubernetes.io/projected/8e21f1d0-e415-4dcb-b33c-327299da2218-kube-api-access-cgm5l\") pod \"insights-runtime-extractor-j9zwr\" (UID: \"8e21f1d0-e415-4dcb-b33c-327299da2218\") " pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.701711 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.701662 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j9zwr" Apr 22 18:38:35.820954 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:35.820916 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j9zwr"] Apr 22 18:38:35.824868 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:35.824838 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e21f1d0_e415_4dcb_b33c_327299da2218.slice/crio-4edfa1bf8b1bf914f5e05536a17d8117c3c1ab669e59b021bb076aab0f423d4b WatchSource:0}: Error finding container 4edfa1bf8b1bf914f5e05536a17d8117c3c1ab669e59b021bb076aab0f423d4b: Status 404 returned error can't find the container with id 4edfa1bf8b1bf914f5e05536a17d8117c3c1ab669e59b021bb076aab0f423d4b Apr 22 18:38:36.430131 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.430077 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:36.432387 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.432366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae20567a-fdd6-4700-8205-d7122697fdbb-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kzgfb\" (UID: \"ae20567a-fdd6-4700-8205-d7122697fdbb\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:36.677655 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.677627 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" Apr 22 18:38:36.746951 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.746923 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j9zwr" event={"ID":"8e21f1d0-e415-4dcb-b33c-327299da2218","Type":"ContainerStarted","Data":"b12b0653494cc72a7fa3287a2fd1493d7c8981c1dd46566ffa5c49137b1961de"} Apr 22 18:38:36.747100 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.746963 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j9zwr" event={"ID":"8e21f1d0-e415-4dcb-b33c-327299da2218","Type":"ContainerStarted","Data":"4edfa1bf8b1bf914f5e05536a17d8117c3c1ab669e59b021bb076aab0f423d4b"} Apr 22 18:38:36.847793 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:36.847763 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb"] Apr 22 18:38:36.851861 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:36.851829 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae20567a_fdd6_4700_8205_d7122697fdbb.slice/crio-6adbd444e80aeb9caeaa4236ddb211e296b12cfc524224f5f7909111c0f07246 WatchSource:0}: Error finding container 6adbd444e80aeb9caeaa4236ddb211e296b12cfc524224f5f7909111c0f07246: Status 404 returned error can't find the container with id 6adbd444e80aeb9caeaa4236ddb211e296b12cfc524224f5f7909111c0f07246 Apr 22 18:38:37.588774 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:37.588728 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jcxbc" podUID="8a27a543-b3c5-436c-8326-abb0c703e4d0" Apr 22 18:38:37.606894 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:37.606859 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-8lhzb" podUID="3dbfdfa2-adbc-427e-8859-26bcaa36a0a7" Apr 22 18:38:37.750525 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:37.750484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" event={"ID":"ae20567a-fdd6-4700-8205-d7122697fdbb","Type":"ContainerStarted","Data":"6adbd444e80aeb9caeaa4236ddb211e296b12cfc524224f5f7909111c0f07246"} Apr 22 18:38:37.752224 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:37.752191 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j9zwr" event={"ID":"8e21f1d0-e415-4dcb-b33c-327299da2218","Type":"ContainerStarted","Data":"8f5a184686acd3afbe52ff408e75d66735585cebcb53477c58cda59286e3cd29"} Apr 22 18:38:37.752348 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:37.752241 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:38:37.752348 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:37.752244 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:38.757950 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:38.757911 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j9zwr" event={"ID":"8e21f1d0-e415-4dcb-b33c-327299da2218","Type":"ContainerStarted","Data":"15de9ae34b74f1779a42ab036d08293baf734c22157be0f14ee55321f4431818"} Apr 22 18:38:38.781558 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:38.781501 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j9zwr" podStartSLOduration=1.546116515 podStartE2EDuration="3.781482911s" podCreationTimestamp="2026-04-22 18:38:35 +0000 UTC" firstStartedPulling="2026-04-22 18:38:35.88158583 +0000 UTC m=+154.203354206" lastFinishedPulling="2026-04-22 18:38:38.116952223 +0000 UTC m=+156.438720602" observedRunningTime="2026-04-22 18:38:38.780280014 +0000 UTC m=+157.102048417" watchObservedRunningTime="2026-04-22 18:38:38.781482911 +0000 UTC m=+157.103251313" Apr 22 18:38:39.270609 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:39.270571 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-vjdzq" podUID="f01e5c64-eadd-49f5-a2f4-4953111daa69" Apr 22 18:38:39.761606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:39.761566 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" event={"ID":"ae20567a-fdd6-4700-8205-d7122697fdbb","Type":"ContainerStarted","Data":"9ddf1f652f890146a5e790c171efd30b95b7a17e7ef82226351269a67fe58840"} Apr 22 18:38:39.780506 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:39.780450 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kzgfb" podStartSLOduration=33.355477396 podStartE2EDuration="35.780432978s" podCreationTimestamp="2026-04-22 18:38:04 +0000 UTC" firstStartedPulling="2026-04-22 18:38:36.853671614 +0000 UTC m=+155.175440003" lastFinishedPulling="2026-04-22 18:38:39.278627206 +0000 UTC m=+157.600395585" observedRunningTime="2026-04-22 18:38:39.779288408 +0000 UTC m=+158.101056807" watchObservedRunningTime="2026-04-22 18:38:39.780432978 +0000 UTC m=+158.102201376" Apr 22 18:38:42.480019 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.479982 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:42.480556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.480083 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:38:42.482629 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.482602 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3dbfdfa2-adbc-427e-8859-26bcaa36a0a7-metrics-tls\") pod \"dns-default-8lhzb\" (UID: \"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7\") " 
pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:42.482779 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.482757 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8a27a543-b3c5-436c-8326-abb0c703e4d0-cert\") pod \"ingress-canary-jcxbc\" (UID: \"8a27a543-b3c5-436c-8326-abb0c703e4d0\") " pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:38:42.555986 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.555960 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-4wrcz\"" Apr 22 18:38:42.556867 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.556851 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-9qw5s\"" Apr 22 18:38:42.563654 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.563640 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:42.563744 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.563723 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jcxbc" Apr 22 18:38:42.686928 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.686899 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jcxbc"] Apr 22 18:38:42.688643 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:42.688614 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a27a543_b3c5_436c_8326_abb0c703e4d0.slice/crio-2a8d4e7cbce64c459d964ddc32b358a6716f6a2de29d94a69e42a374c9a07769 WatchSource:0}: Error finding container 2a8d4e7cbce64c459d964ddc32b358a6716f6a2de29d94a69e42a374c9a07769: Status 404 returned error can't find the container with id 2a8d4e7cbce64c459d964ddc32b358a6716f6a2de29d94a69e42a374c9a07769 Apr 22 18:38:42.699694 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.699670 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8lhzb"] Apr 22 18:38:42.702662 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:42.702636 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dbfdfa2_adbc_427e_8859_26bcaa36a0a7.slice/crio-def29a8eeebd27102a7cea28845b387e64f0e17a9b937b7471932be4299b44fc WatchSource:0}: Error finding container def29a8eeebd27102a7cea28845b387e64f0e17a9b937b7471932be4299b44fc: Status 404 returned error can't find the container with id def29a8eeebd27102a7cea28845b387e64f0e17a9b937b7471932be4299b44fc Apr 22 18:38:42.768505 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.768474 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jcxbc" event={"ID":"8a27a543-b3c5-436c-8326-abb0c703e4d0","Type":"ContainerStarted","Data":"2a8d4e7cbce64c459d964ddc32b358a6716f6a2de29d94a69e42a374c9a07769"} Apr 22 18:38:42.769407 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:42.769381 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8lhzb" event={"ID":"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7","Type":"ContainerStarted","Data":"def29a8eeebd27102a7cea28845b387e64f0e17a9b937b7471932be4299b44fc"} Apr 22 18:38:44.776464 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.776435 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-jcxbc" event={"ID":"8a27a543-b3c5-436c-8326-abb0c703e4d0","Type":"ContainerStarted","Data":"ccc746c1b847a2ab76aea60dacd9e19b53eceb464c95f26c2feebc34833851ce"} Apr 22 18:38:44.778226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.778202 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8lhzb" event={"ID":"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7","Type":"ContainerStarted","Data":"b55571a34d2d4fde9890cb83ccfb3feb2475832e58be9b9548dce6b77569b500"} Apr 22 18:38:44.778226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.778230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8lhzb" event={"ID":"3dbfdfa2-adbc-427e-8859-26bcaa36a0a7","Type":"ContainerStarted","Data":"a6e3af7f12a941503095e1bf568c8e71658492e823864154dc0d0f54bc08fd24"} Apr 22 18:38:44.778389 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.778376 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:44.794026 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.793954 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jcxbc" podStartSLOduration=128.944590603 podStartE2EDuration="2m10.793940116s" podCreationTimestamp="2026-04-22 18:36:34 +0000 UTC" firstStartedPulling="2026-04-22 18:38:42.690506191 +0000 UTC m=+161.012274567" lastFinishedPulling="2026-04-22 18:38:44.5398557 +0000 UTC m=+162.861624080" observedRunningTime="2026-04-22 18:38:44.792910864 +0000 UTC m=+163.114679262" watchObservedRunningTime="2026-04-22 18:38:44.793940116 +0000 UTC m=+163.115708513" Apr 22 18:38:44.809055 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:44.809013 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8lhzb" podStartSLOduration=128.975973055 podStartE2EDuration="2m10.80899687s" podCreationTimestamp="2026-04-22 18:36:34 +0000 UTC" firstStartedPulling="2026-04-22 18:38:42.704365267 +0000 UTC m=+161.026133657" lastFinishedPulling="2026-04-22 18:38:44.537389093 +0000 UTC m=+162.859157472" observedRunningTime="2026-04-22 18:38:44.808785125 +0000 UTC m=+163.130553524" watchObservedRunningTime="2026-04-22 18:38:44.80899687 +0000 UTC m=+163.130765268" Apr 22 18:38:45.440723 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:45.440693 2571 patch_prober.go:28] interesting pod/image-registry-6d59f678db-bsx5q container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:38:45.440877 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:45.440741 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:38:47.255069 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.255033 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9"] Apr 22 18:38:47.257322 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.257303 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.260184 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.260163 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8rsgp\"" Apr 22 18:38:47.260281 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.260199 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 22 18:38:47.261244 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.261226 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:38:47.261370 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.261339 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:38:47.272001 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.271979 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9"] Apr 22 18:38:47.272951 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.272930 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-s5kds"] Apr 22 18:38:47.275119 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.275076 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.276401 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.276365 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tr8d6"] Apr 22 18:38:47.278304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.278287 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:38:47.278304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.278299 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:38:47.278465 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.278424 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.278518 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.278489 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:38:47.278518 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.278509 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-bc9vl\"" Apr 22 18:38:47.281022 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.281004 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 22 18:38:47.281142 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.281012 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-x9xhj\"" Apr 22 18:38:47.281478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.281461 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 22 18:38:47.281561 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.281520 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 22 18:38:47.296244 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.296225 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tr8d6"] Apr 22 18:38:47.319671 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319648 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.319763 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.319763 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319694 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.319763 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319710 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-wtmp\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.319897 
ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319805 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.319897 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319845 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.319897 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319863 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.319897 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319877 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d7979-e87b-47c3-9f49-79736b091f74-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.320112 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319900 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/241a6dfc-3317-476a-87a7-c45566d85531-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.320112 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.319950 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z277\" (UniqueName: \"kubernetes.io/projected/f7e2838f-4cf4-4061-b415-1736faeee81b-kube-api-access-8z277\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320112 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-sys\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320112 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320040 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-accelerators-collector-config\") pod \"node-exporter-s5kds\" (UID: 
\"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320112 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320068 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-textfile\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320137 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbk4k\" (UniqueName: \"kubernetes.io/projected/711d7979-e87b-47c3-9f49-79736b091f74-kube-api-access-qbk4k\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320159 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-metrics-client-ca\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320178 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxhlw\" (UniqueName: \"kubernetes.io/projected/241a6dfc-3317-476a-87a7-c45566d85531-kube-api-access-mxhlw\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320269 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-root\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.320339 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.320295 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421405 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-sys\") pod \"node-exporter-s5kds\" (UID: 
\"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421405 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421403 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-accelerators-collector-config\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421421 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-textfile\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421438 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qbk4k\" (UniqueName: \"kubernetes.io/projected/711d7979-e87b-47c3-9f49-79736b091f74-kube-api-access-qbk4k\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421456 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-metrics-client-ca\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421475 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxhlw\" (UniqueName: \"kubernetes.io/projected/241a6dfc-3317-476a-87a7-c45566d85531-kube-api-access-mxhlw\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-sys\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421506 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421606 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-root\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421606 
ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421559 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421612 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-root\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421676 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421706 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421739 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421765 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-wtmp\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421819 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:38:47.421871 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.421884 2571 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421899 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d7979-e87b-47c3-9f49-79736b091f74-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421915 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-textfile\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.421924 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/241a6dfc-3317-476a-87a7-c45566d85531-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.421971 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.421945 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls podName:241a6dfc-3317-476a-87a7-c45566d85531 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:47.9219244 +0000 UTC m=+166.243692778 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-tr8d6" (UID: "241a6dfc-3317-476a-87a7-c45566d85531") : secret "kube-state-metrics-tls" not found Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.422010 2571 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.422020 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.422082 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls podName:711d7979-e87b-47c3-9f49-79736b091f74 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:47.92206904 +0000 UTC m=+166.243837421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-qw5j9" (UID: "711d7979-e87b-47c3-9f49-79736b091f74") : secret "openshift-state-metrics-tls" not found Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:47.422133 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls podName:f7e2838f-4cf4-4061-b415-1736faeee81b nodeName:}" failed. No retries permitted until 2026-04-22 18:38:47.922119284 +0000 UTC m=+166.243887661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls") pod "node-exporter-s5kds" (UID: "f7e2838f-4cf4-4061-b415-1736faeee81b") : secret "node-exporter-tls" not found Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422163 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8z277\" (UniqueName: \"kubernetes.io/projected/f7e2838f-4cf4-4061-b415-1736faeee81b-kube-api-access-8z277\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422185 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-accelerators-collector-config\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422251 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7e2838f-4cf4-4061-b415-1736faeee81b-metrics-client-ca\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422283 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.422575 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422390 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-wtmp\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.422912 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.422637 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/711d7979-e87b-47c3-9f49-79736b091f74-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.423070 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.423046 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/241a6dfc-3317-476a-87a7-c45566d85531-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.423307 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.423288 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/241a6dfc-3317-476a-87a7-c45566d85531-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.424318 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.424294 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.424645 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.424630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.424689 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.424633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.429618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.429595 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbk4k\" (UniqueName: \"kubernetes.io/projected/711d7979-e87b-47c3-9f49-79736b091f74-kube-api-access-qbk4k\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.430277 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.430259 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z277\" (UniqueName: \"kubernetes.io/projected/f7e2838f-4cf4-4061-b415-1736faeee81b-kube-api-access-8z277\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.430359 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.430341 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxhlw\" (UniqueName: \"kubernetes.io/projected/241a6dfc-3317-476a-87a7-c45566d85531-kube-api-access-mxhlw\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: 
\"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.926346 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.926313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:47.926346 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.926345 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.926546 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.926378 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.928685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.928656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/241a6dfc-3317-476a-87a7-c45566d85531-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-tr8d6\" (UID: \"241a6dfc-3317-476a-87a7-c45566d85531\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:47.929148 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.929121 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f7e2838f-4cf4-4061-b415-1736faeee81b-node-exporter-tls\") pod \"node-exporter-s5kds\" (UID: \"f7e2838f-4cf4-4061-b415-1736faeee81b\") " pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:47.929244 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:47.929125 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/711d7979-e87b-47c3-9f49-79736b091f74-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-qw5j9\" (UID: \"711d7979-e87b-47c3-9f49-79736b091f74\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:48.166676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.166638 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" Apr 22 18:38:48.184659 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.184598 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-s5kds" Apr 22 18:38:48.189313 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.189284 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" Apr 22 18:38:48.195079 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:48.195041 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e2838f_4cf4_4061_b415_1736faeee81b.slice/crio-48e98e58802589e03323a7e7d31f45baaf03780a55310de38b628ae9bf6f5c8f WatchSource:0}: Error finding container 48e98e58802589e03323a7e7d31f45baaf03780a55310de38b628ae9bf6f5c8f: Status 404 returned error can't find the container with id 48e98e58802589e03323a7e7d31f45baaf03780a55310de38b628ae9bf6f5c8f Apr 22 18:38:48.221179 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.221151 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:38:48.224585 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.224563 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.230255 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.227438 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:38:48.230255 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.227746 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.234940 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.234972 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.235003 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.235207 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.235228 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.235738 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.235967 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-29qdl\"" Apr 22 18:38:48.236396 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.236207 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:38:48.240174 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.239328 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:38:48.328208 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.328176 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9"] Apr 22 18:38:48.330445 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330564 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330483 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330564 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330564 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330540 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330719 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330576 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330719 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330719 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330640 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330869 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330748 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330869 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330777 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330869 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.330869 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.331071 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330884 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.331071 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.330916 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwkbh\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.332652 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:48.332629 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod711d7979_e87b_47c3_9f49_79736b091f74.slice/crio-7ccb68ef998520e4ef69a684edb635eba2e0d7cb5186c9fa57a79a956708b49f WatchSource:0}: Error finding container 7ccb68ef998520e4ef69a684edb635eba2e0d7cb5186c9fa57a79a956708b49f: Status 404 returned error can't find the container with id 7ccb68ef998520e4ef69a684edb635eba2e0d7cb5186c9fa57a79a956708b49f Apr 22 18:38:48.339495 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.339475 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-tr8d6"] Apr 22 18:38:48.343756 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:48.343733 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod241a6dfc_3317_476a_87a7_c45566d85531.slice/crio-15da5ac229bd680f74f0a9ac0c07d1119da373c978ff927fba2fbb83bf0c220b WatchSource:0}: Error finding container 15da5ac229bd680f74f0a9ac0c07d1119da373c978ff927fba2fbb83bf0c220b: Status 404 returned error can't find the container with id 15da5ac229bd680f74f0a9ac0c07d1119da373c978ff927fba2fbb83bf0c220b Apr 22 18:38:48.432046 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:38:48.432015 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432186 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432062 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432186 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432108 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432186 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432175 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432199 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432261 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cwkbh\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh\") pod \"alertmanager-main-0\" (UID: 
\"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432355 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432426 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.432467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.432452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.433786 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.433073 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.434273 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.434249 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.435802 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.435743 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.436901 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.436881 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.437639 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.437613 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.437780 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.437756 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.437893 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.437848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.438386 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.438339 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.438504 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.438480 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.438571 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.438484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.439036 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.439020 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.439866 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.439841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.442772 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.442753 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwkbh\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh\") pod \"alertmanager-main-0\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.550245 
ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.550210 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:38:48.678323 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.678289 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:38:48.681895 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:48.681864 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5444bfb7_a5b8_4fd7_99b7_e740f2f985fc.slice/crio-254dc62566c9771c4566ecf067563e276b3c28a5e0df09ae62195d22492f2161 WatchSource:0}: Error finding container 254dc62566c9771c4566ecf067563e276b3c28a5e0df09ae62195d22492f2161: Status 404 returned error can't find the container with id 254dc62566c9771c4566ecf067563e276b3c28a5e0df09ae62195d22492f2161 Apr 22 18:38:48.791970 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.791910 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"254dc62566c9771c4566ecf067563e276b3c28a5e0df09ae62195d22492f2161"} Apr 22 18:38:48.793605 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.793549 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" event={"ID":"241a6dfc-3317-476a-87a7-c45566d85531","Type":"ContainerStarted","Data":"15da5ac229bd680f74f0a9ac0c07d1119da373c978ff927fba2fbb83bf0c220b"} Apr 22 18:38:48.796029 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.796006 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s5kds" event={"ID":"f7e2838f-4cf4-4061-b415-1736faeee81b","Type":"ContainerStarted","Data":"48e98e58802589e03323a7e7d31f45baaf03780a55310de38b628ae9bf6f5c8f"} Apr 22 18:38:48.798372 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.798342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" event={"ID":"711d7979-e87b-47c3-9f49-79736b091f74","Type":"ContainerStarted","Data":"fe2d09100190141c5fb96a8d6773d85e1b158f749b439fa55a7126f3a249f6d9"} Apr 22 18:38:48.798487 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.798378 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" event={"ID":"711d7979-e87b-47c3-9f49-79736b091f74","Type":"ContainerStarted","Data":"557ec95574b652683b3c6854c8a27b5ff2d0f05a0be35ec950380ca98dea675d"} Apr 22 18:38:48.798487 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:48.798393 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" event={"ID":"711d7979-e87b-47c3-9f49-79736b091f74","Type":"ContainerStarted","Data":"7ccb68ef998520e4ef69a684edb635eba2e0d7cb5186c9fa57a79a956708b49f"} Apr 22 18:38:49.802777 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:49.802744 2571 generic.go:358] "Generic (PLEG): container finished" podID="f7e2838f-4cf4-4061-b415-1736faeee81b" containerID="40f8349459f2506d5e71aafe7c81bf627b58593ed06f5f717f99ceaa47f3ad38" exitCode=0 Apr 22 18:38:49.803225 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:49.802809 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s5kds" 
event={"ID":"f7e2838f-4cf4-4061-b415-1736faeee81b","Type":"ContainerDied","Data":"40f8349459f2506d5e71aafe7c81bf627b58593ed06f5f717f99ceaa47f3ad38"} Apr 22 18:38:50.285563 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.285536 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-t9zsh"] Apr 22 18:38:50.288684 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.288658 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:38:50.291320 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.291296 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 18:38:50.291768 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.291750 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 18:38:50.291926 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.291459 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-z579c\"" Apr 22 18:38:50.297948 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.297907 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-t9zsh"] Apr 22 18:38:50.349815 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.349735 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbrh\" (UniqueName: \"kubernetes.io/projected/6d29fb6d-5501-456e-8bb6-c5b983b92684-kube-api-access-2fbrh\") pod \"downloads-6bcc868b7-t9zsh\" (UID: \"6d29fb6d-5501-456e-8bb6-c5b983b92684\") " pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:38:50.451012 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.450980 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbrh\" (UniqueName: \"kubernetes.io/projected/6d29fb6d-5501-456e-8bb6-c5b983b92684-kube-api-access-2fbrh\") pod \"downloads-6bcc868b7-t9zsh\" (UID: \"6d29fb6d-5501-456e-8bb6-c5b983b92684\") " pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:38:50.459360 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.459333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbrh\" (UniqueName: \"kubernetes.io/projected/6d29fb6d-5501-456e-8bb6-c5b983b92684-kube-api-access-2fbrh\") pod \"downloads-6bcc868b7-t9zsh\" (UID: \"6d29fb6d-5501-456e-8bb6-c5b983b92684\") " pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:38:50.599446 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.599411 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:38:50.713572 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.713545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-t9zsh"] Apr 22 18:38:50.716252 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:50.716221 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d29fb6d_5501_456e_8bb6_c5b983b92684.slice/crio-ec13fae526faf737809956a8e4d9dfd8fdba409fe90fb0527a955b4adbae5bc1 WatchSource:0}: Error finding container ec13fae526faf737809956a8e4d9dfd8fdba409fe90fb0527a955b4adbae5bc1: Status 404 returned error can't find the container with id ec13fae526faf737809956a8e4d9dfd8fdba409fe90fb0527a955b4adbae5bc1 Apr 22 18:38:50.808247 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.808189 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" event={"ID":"241a6dfc-3317-476a-87a7-c45566d85531","Type":"ContainerStarted","Data":"428f43d1410a6e89c1e6a2acb018d029371c450cce1ec90b3799e4842747b4e0"} Apr 22 18:38:50.808247 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.808228 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" event={"ID":"241a6dfc-3317-476a-87a7-c45566d85531","Type":"ContainerStarted","Data":"ce23e1566eab53dc9af96acdaa17774da24389917a9711d104a34d35aa4fb681"} Apr 22 18:38:50.808247 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.808240 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" event={"ID":"241a6dfc-3317-476a-87a7-c45566d85531","Type":"ContainerStarted","Data":"5e632aaebdfae43619ebb91a36be24dcac4e64d9e2c5463b58fa24c5ba55fe58"} Apr 22 18:38:50.809994 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.809972 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s5kds" event={"ID":"f7e2838f-4cf4-4061-b415-1736faeee81b","Type":"ContainerStarted","Data":"07c1260a3458b832545493bcbf8bd0d813204de4ac518366f17d292fa0eb0b9b"} Apr 22 18:38:50.810101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.809997 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-s5kds" event={"ID":"f7e2838f-4cf4-4061-b415-1736faeee81b","Type":"ContainerStarted","Data":"83abda2facffe3533402563e6d12c18f0e472492835991c381beedaf0e3e27b0"} Apr 22 18:38:50.811858 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.811832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" event={"ID":"711d7979-e87b-47c3-9f49-79736b091f74","Type":"ContainerStarted","Data":"c69f662d847136c66e83fbd7b17411159c84243e301f638bf2b4857601367677"} Apr 22 18:38:50.812866 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.812847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-t9zsh" event={"ID":"6d29fb6d-5501-456e-8bb6-c5b983b92684","Type":"ContainerStarted","Data":"ec13fae526faf737809956a8e4d9dfd8fdba409fe90fb0527a955b4adbae5bc1"} Apr 22 18:38:50.814046 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.814024 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36" exitCode=0 Apr 22 18:38:50.814121 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:38:50.814080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36"} Apr 22 18:38:50.828192 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.828157 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-tr8d6" podStartSLOduration=2.133639339 podStartE2EDuration="3.828143433s" podCreationTimestamp="2026-04-22 18:38:47 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.345879522 +0000 UTC m=+166.667647899" lastFinishedPulling="2026-04-22 18:38:50.040383607 +0000 UTC m=+168.362151993" observedRunningTime="2026-04-22 18:38:50.827653658 +0000 UTC m=+169.149422057" watchObservedRunningTime="2026-04-22 18:38:50.828143433 +0000 UTC m=+169.149911830" Apr 22 18:38:50.849109 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.848953 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-s5kds" podStartSLOduration=3.022883555 podStartE2EDuration="3.848940787s" podCreationTimestamp="2026-04-22 18:38:47 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.198383919 +0000 UTC m=+166.520152294" lastFinishedPulling="2026-04-22 18:38:49.02444113 +0000 UTC m=+167.346209526" observedRunningTime="2026-04-22 18:38:50.848669933 +0000 UTC m=+169.170438330" watchObservedRunningTime="2026-04-22 18:38:50.848940787 +0000 UTC m=+169.170709188" Apr 22 18:38:50.882696 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:50.882596 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-qw5j9" podStartSLOduration=2.338561998 podStartE2EDuration="3.88258117s" podCreationTimestamp="2026-04-22 18:38:47 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.495797646 +0000 UTC m=+166.817566023" lastFinishedPulling="2026-04-22 18:38:50.039816805 +0000 UTC m=+168.361585195" observedRunningTime="2026-04-22 18:38:50.882556954 +0000 UTC m=+169.204325353" watchObservedRunningTime="2026-04-22 18:38:50.88258117 +0000 UTC m=+169.204349569" Apr 22 18:38:51.455758 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.455722 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-86bf4ff869-c8r6f"] Apr 22 18:38:51.458223 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.458189 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.460953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.460929 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 22 18:38:51.461192 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.461175 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 22 18:38:51.462256 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.461988 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-bllucasduqu75\"" Apr 22 18:38:51.462256 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.461994 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-q285c\"" Apr 22 18:38:51.462256 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.462079 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 22 18:38:51.462256 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.462013 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 22 18:38:51.469948 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.469900 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-86bf4ff869-c8r6f"] Apr 22 18:38:51.560140 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-metrics-server-audit-profiles\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560316 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560164 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-tls\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560316 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560194 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-client-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560316 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560226 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55ee5af3-880b-43e0-9488-372ee8e23ae3-audit-log\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560316 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560252 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxwx9\" (UniqueName: \"kubernetes.io/projected/55ee5af3-880b-43e0-9488-372ee8e23ae3-kube-api-access-qxwx9\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560545 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560343 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-client-certs\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.560545 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.560402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661150 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661112 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661309 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661227 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-metrics-server-audit-profiles\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661309 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661255 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-tls\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661309 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661280 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-client-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661475 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55ee5af3-880b-43e0-9488-372ee8e23ae3-audit-log\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661475 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:38:51.661340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxwx9\" (UniqueName: \"kubernetes.io/projected/55ee5af3-880b-43e0-9488-372ee8e23ae3-kube-api-access-qxwx9\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.661475 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-client-certs\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.662275 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661940 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.662275 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.661963 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/55ee5af3-880b-43e0-9488-372ee8e23ae3-audit-log\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.662440 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.662272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/55ee5af3-880b-43e0-9488-372ee8e23ae3-metrics-server-audit-profiles\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.664513 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.664462 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-tls\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.664670 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.664608 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-secret-metrics-server-client-certs\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.664670 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.664647 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ee5af3-880b-43e0-9488-372ee8e23ae3-client-ca-bundle\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.670705 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:38:51.670662 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxwx9\" (UniqueName: \"kubernetes.io/projected/55ee5af3-880b-43e0-9488-372ee8e23ae3-kube-api-access-qxwx9\") pod \"metrics-server-86bf4ff869-c8r6f\" (UID: \"55ee5af3-880b-43e0-9488-372ee8e23ae3\") " pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.773501 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.773467 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:38:51.981214 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.981178 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6"] Apr 22 18:38:51.987501 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.987471 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:51.991355 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.990475 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-xzsl7\"" Apr 22 18:38:51.991355 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.991172 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 18:38:51.992585 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:51.992511 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6"] Apr 22 18:38:52.065666 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.065589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cvth6\" (UID: \"9ce92ab3-5791-4660-bb4e-4740f84eb7d5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:52.167267 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.167212 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cvth6\" (UID: \"9ce92ab3-5791-4660-bb4e-4740f84eb7d5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:52.167430 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:52.167366 2571 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 22 18:38:52.167487 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:38:52.167437 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert podName:9ce92ab3-5791-4660-bb4e-4740f84eb7d5 nodeName:}" failed. No retries permitted until 2026-04-22 18:38:52.667417181 +0000 UTC m=+170.989185559 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-cvth6" (UID: "9ce92ab3-5791-4660-bb4e-4740f84eb7d5") : secret "monitoring-plugin-cert" not found Apr 22 18:38:52.292141 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.292116 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-86bf4ff869-c8r6f"] Apr 22 18:38:52.297690 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:52.297655 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ee5af3_880b_43e0_9488_372ee8e23ae3.slice/crio-70ae5231e2c045873322f958bc6f81116954d8914723db82909e91155f376d3b WatchSource:0}: Error finding container 70ae5231e2c045873322f958bc6f81116954d8914723db82909e91155f376d3b: Status 404 returned error can't find the container with id 70ae5231e2c045873322f958bc6f81116954d8914723db82909e91155f376d3b Apr 22 18:38:52.672452 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.672375 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cvth6\" (UID: \"9ce92ab3-5791-4660-bb4e-4740f84eb7d5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:52.675243 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.675219 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9ce92ab3-5791-4660-bb4e-4740f84eb7d5-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-cvth6\" (UID: \"9ce92ab3-5791-4660-bb4e-4740f84eb7d5\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:52.822480 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.822447 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef"} Apr 22 18:38:52.822618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.822485 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4"} Apr 22 18:38:52.822618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.822502 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6"} Apr 22 18:38:52.822618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.822514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944"} Apr 22 18:38:52.822618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.822526 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec"} Apr 22 18:38:52.823417 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.823396 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" event={"ID":"55ee5af3-880b-43e0-9488-372ee8e23ae3","Type":"ContainerStarted","Data":"70ae5231e2c045873322f958bc6f81116954d8914723db82909e91155f376d3b"} Apr 22 18:38:52.900949 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:52.900920 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:53.047630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.047548 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6"] Apr 22 18:38:53.052271 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:53.052239 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce92ab3_5791_4660_bb4e_4740f84eb7d5.slice/crio-7457408d5acfe287edafaeab67359602b7522e68dcec2edae3135f6bf17db8ff WatchSource:0}: Error finding container 7457408d5acfe287edafaeab67359602b7522e68dcec2edae3135f6bf17db8ff: Status 404 returned error can't find the container with id 7457408d5acfe287edafaeab67359602b7522e68dcec2edae3135f6bf17db8ff Apr 22 18:38:53.433414 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.431251 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:38:53.449467 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.449442 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.452382 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.452353 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ztvmr\"" Apr 22 18:38:53.453989 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.453948 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:38:53.455113 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.454908 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:38:53.455500 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.455478 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:38:53.455667 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.455641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:38:53.455763 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.455743 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.456842 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.456876 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.456879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.456942 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.456884 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.457022 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1j05c8btsius7\"" Apr 22 18:38:53.457163 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.457046 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:38:53.458133 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.458110 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:38:53.463778 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.463755 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:38:53.480225 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480204 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480474 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480455 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480659 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480760 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480707 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480826 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480803 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480875 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480875 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480974 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480892 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480974 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480931 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.480974 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.480955 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481036 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481110 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwpmx\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481158 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481240 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481290 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481267 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481609 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481294 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.481609 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.481317 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582412 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582412 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582379 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582417 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582442 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582501 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582533 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582564 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582610 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle\") 
pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582695 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582718 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582759 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582785 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwpmx\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582817 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582845 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.582903 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.582890 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.585657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.584338 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.585657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.584971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.585657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.584992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.586499 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.586472 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.587103 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.586971 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.587103 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.587035 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.587103 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.587042 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.587938 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.587896 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out\") pod \"prometheus-k8s-0\" (UID: 
\"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.588316 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.588296 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.588521 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.588475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.589311 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.589271 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.589783 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.589422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.589783 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.589481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.589922 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.589846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.589922 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.589863 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.590189 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.590167 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.590309 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.590263 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.591800 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.591780 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.593424 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.593384 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwpmx\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx\") pod \"prometheus-k8s-0\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.766782 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.766595 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:38:53.829620 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.829581 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" event={"ID":"9ce92ab3-5791-4660-bb4e-4740f84eb7d5","Type":"ContainerStarted","Data":"7457408d5acfe287edafaeab67359602b7522e68dcec2edae3135f6bf17db8ff"} Apr 22 18:38:53.833896 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.833819 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerStarted","Data":"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed"} Apr 22 18:38:53.867374 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:53.867329 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.248987271 podStartE2EDuration="5.867313557s" podCreationTimestamp="2026-04-22 18:38:48 +0000 UTC" firstStartedPulling="2026-04-22 18:38:48.684242174 +0000 UTC m=+167.006010551" lastFinishedPulling="2026-04-22 18:38:53.30256845 +0000 UTC m=+171.624336837" observedRunningTime="2026-04-22 18:38:53.864669337 +0000 UTC m=+172.186437759" watchObservedRunningTime="2026-04-22 18:38:53.867313557 +0000 UTC m=+172.189081957" Apr 22 18:38:54.106473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.106445 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:38:54.109181 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:38:54.109147 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93755d7c_1f4f_4ee2_b1c8_f17dc4cc4cf8.slice/crio-ccae2022e83a7e5b39b6009a46837aa8dba67129b807e31d2df3a6307c489d0a WatchSource:0}: Error finding container ccae2022e83a7e5b39b6009a46837aa8dba67129b807e31d2df3a6307c489d0a: Status 404 returned error can't find the container with id ccae2022e83a7e5b39b6009a46837aa8dba67129b807e31d2df3a6307c489d0a Apr 22 18:38:54.250458 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.250424 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:38:54.783162 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.783128 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8lhzb" Apr 22 18:38:54.843164 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.842607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" event={"ID":"55ee5af3-880b-43e0-9488-372ee8e23ae3","Type":"ContainerStarted","Data":"f679825aa7224fa4dcb1a725b8720010ea564b2ae8596846e002cf68d0c5be0b"} Apr 22 18:38:54.845321 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.845289 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" exitCode=0 Apr 22 18:38:54.846358 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.846275 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} Apr 22 18:38:54.846358 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.846324 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"ccae2022e83a7e5b39b6009a46837aa8dba67129b807e31d2df3a6307c489d0a"} Apr 22 18:38:54.861470 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:54.861392 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" podStartSLOduration=2.169699656 podStartE2EDuration="3.861377799s" podCreationTimestamp="2026-04-22 18:38:51 +0000 UTC" firstStartedPulling="2026-04-22 18:38:52.299765234 +0000 UTC m=+170.621533619" lastFinishedPulling="2026-04-22 18:38:53.99144338 +0000 UTC m=+172.313211762" observedRunningTime="2026-04-22 18:38:54.860608713 +0000 UTC m=+173.182377111" watchObservedRunningTime="2026-04-22 18:38:54.861377799 +0000 UTC m=+173.183146197" Apr 22 18:38:55.441746 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:55.441707 2571 patch_prober.go:28] interesting pod/image-registry-6d59f678db-bsx5q container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 18:38:55.442133 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:55.441767 2571 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:38:55.850890 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:55.850850 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" event={"ID":"9ce92ab3-5791-4660-bb4e-4740f84eb7d5","Type":"ContainerStarted","Data":"508a8100125e5b08a3f43598e322704e797b950cef4521134ff488d4cd4684f5"} Apr 22 18:38:55.869032 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:55.868987 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" 
podStartSLOduration=2.964909074 podStartE2EDuration="4.868970695s" podCreationTimestamp="2026-04-22 18:38:51 +0000 UTC" firstStartedPulling="2026-04-22 18:38:53.054720264 +0000 UTC m=+171.376488640" lastFinishedPulling="2026-04-22 18:38:54.958781866 +0000 UTC m=+173.280550261" observedRunningTime="2026-04-22 18:38:55.868237766 +0000 UTC m=+174.190006165" watchObservedRunningTime="2026-04-22 18:38:55.868970695 +0000 UTC m=+174.190739089" Apr 22 18:38:56.854265 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:56.854222 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:56.859765 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:56.859739 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-cvth6" Apr 22 18:38:57.861138 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:57.861107 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} Apr 22 18:38:58.867930 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:58.867873 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} Apr 22 18:38:59.877878 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:38:59.877847 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} Apr 22 18:39:00.455503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.455413 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" containerID="cri-o://9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92" gracePeriod=30 Apr 22 18:39:00.698984 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.698963 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:39:00.758506 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758487 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758535 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5srjt\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758560 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758618 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758592 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758748 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758711 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758815 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758798 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758874 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758831 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758874 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758867 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca\") pod \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\" (UID: \"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a\") " Apr 22 18:39:00.758966 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.758931 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:00.759382 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.759357 2571 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-certificates\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.759382 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.759365 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:39:00.761061 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.761034 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:00.761272 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.761242 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:00.761359 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.761279 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:00.761359 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.761294 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt" (OuterVolumeSpecName: "kube-api-access-5srjt") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "kube-api-access-5srjt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:39:00.761359 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.761328 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:39:00.767007 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.766984 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" (UID: "9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:39:00.860667 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860642 2571 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-ca-trust-extracted\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860667 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860665 2571 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-bound-sa-token\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860675 2571 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-installation-pull-secrets\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860686 2571 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-registry-tls\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860696 2571 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-image-registry-private-configuration\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860705 2571 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-trusted-ca\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.860794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.860714 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5srjt\" (UniqueName: \"kubernetes.io/projected/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a-kube-api-access-5srjt\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:39:00.881849 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.881825 2571 generic.go:358] "Generic (PLEG): container finished" podID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerID="9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92" exitCode=0 Apr 22 18:39:00.882216 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.881884 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" Apr 22 18:39:00.882216 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.881906 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" event={"ID":"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a","Type":"ContainerDied","Data":"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92"} Apr 22 18:39:00.882216 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.881938 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d59f678db-bsx5q" event={"ID":"9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a","Type":"ContainerDied","Data":"cd7b6d0110aef2d662602e9471953e7cb2b97d6cd2eccf13c70f8519ef0daacb"} Apr 22 18:39:00.882216 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.881956 2571 scope.go:117] "RemoveContainer" containerID="9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92" Apr 22 18:39:00.885214 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.885190 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} Apr 22 18:39:00.885291 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.885223 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} Apr 22 18:39:00.885291 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.885241 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerStarted","Data":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} Apr 22 18:39:00.891198 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.891006 2571 scope.go:117] "RemoveContainer" containerID="9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92" Apr 22 18:39:00.891363 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:39:00.891335 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92\": container with ID starting with 9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92 not found: ID does not exist" containerID="9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92" Apr 22 18:39:00.891443 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.891363 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92"} err="failed to get container status \"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92\": rpc error: code = NotFound desc = could not find container \"9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92\": container with ID starting with 9f5a6188e1ce3764f62638fa5f9526677d60d3a558dc8a6581c5f04f576d9c92 not found: ID does not exist" Apr 22 18:39:00.919381 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.919327 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=3.011920895 podStartE2EDuration="7.919310275s" 
podCreationTimestamp="2026-04-22 18:38:53 +0000 UTC" firstStartedPulling="2026-04-22 18:38:54.847836689 +0000 UTC m=+173.169605073" lastFinishedPulling="2026-04-22 18:38:59.755226074 +0000 UTC m=+178.076994453" observedRunningTime="2026-04-22 18:39:00.915738225 +0000 UTC m=+179.237506625" watchObservedRunningTime="2026-04-22 18:39:00.919310275 +0000 UTC m=+179.241078675" Apr 22 18:39:00.929570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.929550 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:39:00.934151 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:00.934129 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d59f678db-bsx5q"] Apr 22 18:39:02.254949 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:02.254918 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" path="/var/lib/kubelet/pods/9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a/volumes" Apr 22 18:39:03.767288 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:03.767253 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:11.774511 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:11.774475 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:39:11.774976 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:11.774532 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:39:12.926446 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.926359 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-t9zsh" event={"ID":"6d29fb6d-5501-456e-8bb6-c5b983b92684","Type":"ContainerStarted","Data":"bad9fc3c70237d89207bcb780391b179621ab19a149837a7c7ef10139589d5f0"} Apr 22 18:39:12.926876 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.926789 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:39:12.928346 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.928307 2571 generic.go:358] "Generic (PLEG): container finished" podID="dfe419e0-6278-4785-8045-d733d349a280" containerID="7434a7dedc2d4c0cde5cecd571c4b2aa7d625b014cce64faa6adbd8d63ca5539" exitCode=0 Apr 22 18:39:12.928451 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.928412 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" event={"ID":"dfe419e0-6278-4785-8045-d733d349a280","Type":"ContainerDied","Data":"7434a7dedc2d4c0cde5cecd571c4b2aa7d625b014cce64faa6adbd8d63ca5539"} Apr 22 18:39:12.928755 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.928729 2571 scope.go:117] "RemoveContainer" containerID="7434a7dedc2d4c0cde5cecd571c4b2aa7d625b014cce64faa6adbd8d63ca5539" Apr 22 18:39:12.942681 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.942657 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-t9zsh" Apr 22 18:39:12.946434 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:12.946391 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-t9zsh" podStartSLOduration=1.5645511920000001 podStartE2EDuration="22.946374711s" 
podCreationTimestamp="2026-04-22 18:38:50 +0000 UTC" firstStartedPulling="2026-04-22 18:38:50.718305595 +0000 UTC m=+169.040073985" lastFinishedPulling="2026-04-22 18:39:12.100129128 +0000 UTC m=+190.421897504" observedRunningTime="2026-04-22 18:39:12.945920276 +0000 UTC m=+191.267688675" watchObservedRunningTime="2026-04-22 18:39:12.946374711 +0000 UTC m=+191.268143127" Apr 22 18:39:13.934322 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:13.934277 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-c8fsw" event={"ID":"dfe419e0-6278-4785-8045-d733d349a280","Type":"ContainerStarted","Data":"a864fba9003d66a418808634e087cbfa80a077bf51091c90a6135fbb407131ba"} Apr 22 18:39:27.440590 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:27.440549 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/init-config-reloader/0.log" Apr 22 18:39:27.640155 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:27.640129 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/alertmanager/0.log" Apr 22 18:39:27.841271 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:27.841245 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/config-reloader/0.log" Apr 22 18:39:28.039975 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:28.039943 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/kube-rbac-proxy-web/0.log" Apr 22 18:39:28.240967 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:28.240893 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/kube-rbac-proxy/0.log" Apr 22 18:39:28.439851 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:28.439822 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/kube-rbac-proxy-metric/0.log" Apr 22 18:39:28.644639 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:28.644604 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/prom-label-proxy/0.log" Apr 22 18:39:28.841922 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:28.841892 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kzgfb_ae20567a-fdd6-4700-8205-d7122697fdbb/cluster-monitoring-operator/0.log" Apr 22 18:39:29.040947 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:29.040921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-state-metrics/0.log" Apr 22 18:39:29.240325 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:29.240298 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-rbac-proxy-main/0.log" Apr 22 18:39:29.440029 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:29.439960 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-rbac-proxy-self/0.log" 
Apr 22 18:39:29.641002 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:29.640974 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-86bf4ff869-c8r6f_55ee5af3-880b-43e0-9488-372ee8e23ae3/metrics-server/0.log" Apr 22 18:39:29.839518 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:29.839492 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cvth6_9ce92ab3-5791-4660-bb4e-4740f84eb7d5/monitoring-plugin/0.log" Apr 22 18:39:31.240689 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.240662 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/init-textfile/0.log" Apr 22 18:39:31.441704 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.441650 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/node-exporter/0.log" Apr 22 18:39:31.642788 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.642765 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/kube-rbac-proxy/0.log" Apr 22 18:39:31.779579 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.779548 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:39:31.783439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.783416 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-86bf4ff869-c8r6f" Apr 22 18:39:31.844948 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:31.844921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/kube-rbac-proxy-main/0.log" Apr 22 18:39:32.041308 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:32.041278 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/kube-rbac-proxy-self/0.log" Apr 22 18:39:32.240150 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:32.240120 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/openshift-state-metrics/0.log" Apr 22 18:39:32.440303 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:32.440229 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/init-config-reloader/0.log" Apr 22 18:39:32.642304 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:32.642276 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/prometheus/0.log" Apr 22 18:39:32.841048 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:32.841024 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/config-reloader/0.log" Apr 22 18:39:33.040383 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:33.040357 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/thanos-sidecar/0.log" Apr 22 18:39:33.240967 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:33.240892 2571 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/kube-rbac-proxy-web/0.log" Apr 22 18:39:33.440148 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:33.440121 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/kube-rbac-proxy/0.log" Apr 22 18:39:33.641159 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:33.641132 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/kube-rbac-proxy-thanos/0.log" Apr 22 18:39:36.641706 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:36.641676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-t9zsh_6d29fb6d-5501-456e-8bb6-c5b983b92684/download-server/0.log" Apr 22 18:39:37.241002 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:37.240973 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jcxbc_8a27a543-b3c5-436c-8326-abb0c703e4d0/serve-healthcheck-canary/0.log" Apr 22 18:39:53.767690 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:53.767654 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:53.804544 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:53.804520 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:39:54.073641 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:39:54.073567 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:07.488749 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.488719 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:07.489167 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.489145 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="alertmanager" containerID="cri-o://66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" gracePeriod=120 Apr 22 18:40:07.489244 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.489203 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-metric" containerID="cri-o://8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" gracePeriod=120 Apr 22 18:40:07.489300 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.489236 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-web" containerID="cri-o://9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" gracePeriod=120 Apr 22 18:40:07.489300 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.489258 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="config-reloader" containerID="cri-o://e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" gracePeriod=120 Apr 22 18:40:07.489417 ip-10-0-131-85 kubenswrapper[2571]: I0422 
18:40:07.489310 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy" containerID="cri-o://6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" gracePeriod=120 Apr 22 18:40:07.489417 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:07.489269 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="prom-label-proxy" containerID="cri-o://25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" gracePeriod=120 Apr 22 18:40:08.104661 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104622 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" exitCode=0 Apr 22 18:40:08.104661 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104652 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" exitCode=0 Apr 22 18:40:08.104661 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104659 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" exitCode=0 Apr 22 18:40:08.104661 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104665 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" exitCode=0 Apr 22 18:40:08.104960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104694 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed"} Apr 22 18:40:08.104960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104727 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4"} Apr 22 18:40:08.104960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944"} Apr 22 18:40:08.104960 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.104745 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec"} Apr 22 18:40:08.721436 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.721417 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:08.840478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840408 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840442 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840459 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840478 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840477 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840499 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwkbh\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840524 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840575 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840623 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840660 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: 
\"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840700 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.840791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840748 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840802 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840843 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out\") pod \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\" (UID: \"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc\") " Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840865 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.840879 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.841152 2571 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-metrics-client-ca\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.841177 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.841172 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.841654 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.841619 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:08.843416 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.843391 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume" (OuterVolumeSpecName: "config-volume") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.843796 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.843767 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh" (OuterVolumeSpecName: "kube-api-access-cwkbh") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "kube-api-access-cwkbh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:08.844058 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.844028 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.844844 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.844808 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.844933 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.844874 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.844933 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.844885 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out" (OuterVolumeSpecName: "config-out") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:08.845021 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.844931 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:08.845021 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.845005 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.847741 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.847715 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.853685 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.853664 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config" (OuterVolumeSpecName: "web-config") pod "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" (UID: "5444bfb7-a5b8-4fd7-99b7-e740f2f985fc"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:08.942117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942064 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942117 2571 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-alertmanager-main-db\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942128 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-out\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942138 2571 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-config-volume\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942147 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-web-config\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942156 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cwkbh\" (UniqueName: \"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-kube-api-access-cwkbh\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942166 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942175 2571 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-cluster-tls-config\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942185 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-main-tls\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942194 2571 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:08.942254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:08.942203 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc-tls-assets\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:09.109951 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.109888 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" exitCode=0 Apr 22 18:40:09.109951 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.109909 2571 generic.go:358] "Generic (PLEG): container finished" podID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerID="9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" exitCode=0 Apr 22 18:40:09.110075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.109966 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef"} Apr 22 18:40:09.110075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.109998 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.110075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.110013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6"} Apr 22 18:40:09.110075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.110025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"5444bfb7-a5b8-4fd7-99b7-e740f2f985fc","Type":"ContainerDied","Data":"254dc62566c9771c4566ecf067563e276b3c28a5e0df09ae62195d22492f2161"} Apr 22 18:40:09.110075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.110041 2571 scope.go:117] "RemoveContainer" containerID="25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" Apr 22 18:40:09.117793 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.117766 2571 scope.go:117] "RemoveContainer" containerID="8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" Apr 22 18:40:09.123946 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.123930 2571 scope.go:117] "RemoveContainer" containerID="6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" Apr 22 18:40:09.132887 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.132866 2571 scope.go:117] "RemoveContainer" containerID="9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" Apr 22 18:40:09.134559 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.134539 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:09.140101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.140068 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:09.140480 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.140466 2571 scope.go:117] "RemoveContainer" containerID="e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" Apr 22 18:40:09.146498 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.146477 2571 scope.go:117] "RemoveContainer" containerID="66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" Apr 22 18:40:09.152567 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.152552 2571 scope.go:117] 
"RemoveContainer" containerID="e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36" Apr 22 18:40:09.158464 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.158448 2571 scope.go:117] "RemoveContainer" containerID="25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" Apr 22 18:40:09.158727 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.158705 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed\": container with ID starting with 25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed not found: ID does not exist" containerID="25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" Apr 22 18:40:09.158820 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.158736 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed"} err="failed to get container status \"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed\": rpc error: code = NotFound desc = could not find container \"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed\": container with ID starting with 25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed not found: ID does not exist" Apr 22 18:40:09.158820 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.158754 2571 scope.go:117] "RemoveContainer" containerID="8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" Apr 22 18:40:09.159074 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.159046 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef\": container with ID starting with 8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef not found: ID does not exist" containerID="8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" Apr 22 18:40:09.159306 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159082 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef"} err="failed to get container status \"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef\": rpc error: code = NotFound desc = could not find container \"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef\": container with ID starting with 8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef not found: ID does not exist" Apr 22 18:40:09.159306 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159133 2571 scope.go:117] "RemoveContainer" containerID="6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" Apr 22 18:40:09.159455 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.159428 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4\": container with ID starting with 6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4 not found: ID does not exist" containerID="6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" Apr 22 18:40:09.159561 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159464 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4"} err="failed to get container status \"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4\": rpc error: code = NotFound desc = could not find container \"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4\": container with ID starting with 6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4 not found: ID does not exist" Apr 22 18:40:09.159561 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159486 2571 scope.go:117] "RemoveContainer" containerID="9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" Apr 22 18:40:09.159777 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.159756 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6\": container with ID starting with 9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6 not found: ID does not exist" containerID="9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" Apr 22 18:40:09.159822 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159780 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6"} err="failed to get container status \"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6\": rpc error: code = NotFound desc = could not find container \"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6\": container with ID starting with 9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6 not found: ID does not exist" Apr 22 18:40:09.159822 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.159797 2571 scope.go:117] "RemoveContainer" containerID="e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" Apr 22 18:40:09.160051 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.160035 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944\": container with ID starting with e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944 not found: ID does not exist" containerID="e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" Apr 22 18:40:09.160120 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160058 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944"} err="failed to get container status \"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944\": rpc error: code = NotFound desc = could not find container \"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944\": container with ID starting with e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944 not found: ID does not exist" Apr 22 18:40:09.160120 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160073 2571 scope.go:117] "RemoveContainer" containerID="66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" Apr 22 18:40:09.160302 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.160286 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec\": container with ID starting with 
66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec not found: ID does not exist" containerID="66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" Apr 22 18:40:09.160349 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160305 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec"} err="failed to get container status \"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec\": rpc error: code = NotFound desc = could not find container \"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec\": container with ID starting with 66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec not found: ID does not exist" Apr 22 18:40:09.160349 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160317 2571 scope.go:117] "RemoveContainer" containerID="e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36" Apr 22 18:40:09.160493 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:09.160478 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36\": container with ID starting with e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36 not found: ID does not exist" containerID="e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36" Apr 22 18:40:09.160532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160495 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36"} err="failed to get container status \"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36\": rpc error: code = NotFound desc = could not find container \"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36\": container with ID starting with e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36 not found: ID does not exist" Apr 22 18:40:09.160532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160506 2571 scope.go:117] "RemoveContainer" containerID="25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed" Apr 22 18:40:09.160657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160643 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed"} err="failed to get container status \"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed\": rpc error: code = NotFound desc = could not find container \"25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed\": container with ID starting with 25f24c1385a7e554fb12ae2755fff9451856a54d68481a40cd3e23755b9381ed not found: ID does not exist" Apr 22 18:40:09.160657 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160657 2571 scope.go:117] "RemoveContainer" containerID="8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef" Apr 22 18:40:09.160810 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160796 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef"} err="failed to get container status \"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef\": rpc error: code = NotFound desc = could not find container 
\"8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef\": container with ID starting with 8a2f0b263046e62f74af10f0151df9774d2d6d0531e95285c2e2e17ae1549cef not found: ID does not exist" Apr 22 18:40:09.160849 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160809 2571 scope.go:117] "RemoveContainer" containerID="6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4" Apr 22 18:40:09.161011 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.160990 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4"} err="failed to get container status \"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4\": rpc error: code = NotFound desc = could not find container \"6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4\": container with ID starting with 6f8cd0782e82cec52598fee5eb00b578dee7dcd7fb3ac3ef1a0bda86981509c4 not found: ID does not exist" Apr 22 18:40:09.161080 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161014 2571 scope.go:117] "RemoveContainer" containerID="9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6" Apr 22 18:40:09.161215 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161200 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6"} err="failed to get container status \"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6\": rpc error: code = NotFound desc = could not find container \"9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6\": container with ID starting with 9e7f7be469f7ea6f7fc3097835f5521727913047202af82665b5b3aa34c509b6 not found: ID does not exist" Apr 22 18:40:09.161264 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161216 2571 scope.go:117] "RemoveContainer" containerID="e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944" Apr 22 18:40:09.161386 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161372 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944"} err="failed to get container status \"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944\": rpc error: code = NotFound desc = could not find container \"e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944\": container with ID starting with e209ad3ba2ce0b2d7decf5a20bcb3ff1307ca4851982e5391ac83eb31e863944 not found: ID does not exist" Apr 22 18:40:09.161433 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161386 2571 scope.go:117] "RemoveContainer" containerID="66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec" Apr 22 18:40:09.161556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161539 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec"} err="failed to get container status \"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec\": rpc error: code = NotFound desc = could not find container \"66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec\": container with ID starting with 66478a4a80a24ada5adae7f02fac50bd85b8a2d859a0b58ed9cd146e7cf381ec not found: ID does not exist" Apr 22 18:40:09.161602 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161556 2571 scope.go:117] "RemoveContainer" 
containerID="e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36" Apr 22 18:40:09.161693 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.161678 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36"} err="failed to get container status \"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36\": rpc error: code = NotFound desc = could not find container \"e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36\": container with ID starting with e2c59546c9a65973625812ba01e506da2d6705b4111ed0ce0c3f7ffdff395b36 not found: ID does not exist" Apr 22 18:40:09.168187 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168168 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:09.168487 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168474 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168488 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168499 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-metric" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168505 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-metric" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168514 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="alertmanager" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168520 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="alertmanager" Apr 22 18:40:09.168532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168531 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="config-reloader" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168539 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="config-reloader" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168550 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168555 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168562 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-web" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168567 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-web" 
Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168572 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="init-config-reloader" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168577 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="init-config-reloader" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168587 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="prom-label-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168592 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="prom-label-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168643 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="config-reloader" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168653 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168659 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-metric" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168666 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="kube-rbac-proxy-web" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168672 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="alertmanager" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168679 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" containerName="prom-label-proxy" Apr 22 18:40:09.168724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.168688 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="9b546fbc-25fd-42a5-9c4e-c1f6344e9f6a" containerName="registry" Apr 22 18:40:09.173965 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.173949 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.176965 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.176924 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 22 18:40:09.177101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.176953 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 22 18:40:09.177101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.176987 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 22 18:40:09.177101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.176991 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 22 18:40:09.177101 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.176966 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 22 18:40:09.177372 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.177176 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-29qdl\"" Apr 22 18:40:09.177510 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.177490 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 22 18:40:09.177614 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.177505 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 22 18:40:09.177815 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.177796 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 22 18:40:09.183277 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.183256 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 22 18:40:09.184157 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.184133 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:09.345269 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345269 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345292 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345318 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345340 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-web-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345392 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345425 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345453 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345486 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345547 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345578 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-config-out\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345603 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-config-volume\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.345630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.345623 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-kube-api-access-7jlws\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446361 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446289 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-config-volume\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446361 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-kube-api-access-7jlws\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446361 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446347 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446619 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446369 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446619 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446619 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446418 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 
18:40:09.446619 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446434 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-web-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446619 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.446876 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.446654 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.447676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.447135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.447676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.447202 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.447676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.447245 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.447676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.447287 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-config-out\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.447676 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.447403 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.448661 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.448633 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.448885 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.448860 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.449571 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.449520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.449679 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.449660 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-tls-assets\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.449748 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.449722 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.449973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.449949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.450077 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.450059 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-config-volume\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.450362 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.450342 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.450822 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.450799 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.451168 
ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.451149 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/4944fb20-7fca-449f-8e4d-85aed430bac0-web-config\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.451365 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.451348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/4944fb20-7fca-449f-8e4d-85aed430bac0-config-out\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.455968 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.455950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jlws\" (UniqueName: \"kubernetes.io/projected/4944fb20-7fca-449f-8e4d-85aed430bac0-kube-api-access-7jlws\") pod \"alertmanager-main-0\" (UID: \"4944fb20-7fca-449f-8e4d-85aed430bac0\") " pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.483847 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.483828 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 22 18:40:09.608777 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:09.607186 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 22 18:40:09.614453 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:40:09.614427 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4944fb20_7fca_449f_8e4d_85aed430bac0.slice/crio-345c49304c05692f2e01f42d94dc74d54ab8705a8e271400942a9d893417eef9 WatchSource:0}: Error finding container 345c49304c05692f2e01f42d94dc74d54ab8705a8e271400942a9d893417eef9: Status 404 returned error can't find the container with id 345c49304c05692f2e01f42d94dc74d54ab8705a8e271400942a9d893417eef9 Apr 22 18:40:10.114426 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:10.114401 2571 generic.go:358] "Generic (PLEG): container finished" podID="4944fb20-7fca-449f-8e4d-85aed430bac0" containerID="aab18883494acd52a536d937cad1f231560fbb4e7a241013ba930eb3147086a0" exitCode=0 Apr 22 18:40:10.114740 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:10.114492 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerDied","Data":"aab18883494acd52a536d937cad1f231560fbb4e7a241013ba930eb3147086a0"} Apr 22 18:40:10.114740 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:10.114531 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"345c49304c05692f2e01f42d94dc74d54ab8705a8e271400942a9d893417eef9"} Apr 22 18:40:10.256632 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:10.256608 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5444bfb7-a5b8-4fd7-99b7-e740f2f985fc" path="/var/lib/kubelet/pods/5444bfb7-a5b8-4fd7-99b7-e740f2f985fc/volumes" Apr 22 18:40:11.120457 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120422 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"dea4b88996d634020c59ab5ad386eb8a7693e373e0d5f203390e36fc7de7a929"} Apr 22 18:40:11.120457 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120460 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"0fd752043a56a0fb53896da3580a00f93318a9f0fe1f29985088b8ee8daeb9d8"} Apr 22 18:40:11.120910 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120472 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"fe56002f62db7917bfb619ca0ed519b4ad378e1cec6538f1f1e1f3f2b00c2e73"} Apr 22 18:40:11.120910 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120483 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"0a7e8bef5d4999f785b983758165933dc23a751ea2e5d8ac91d48e8d124c1be5"} Apr 22 18:40:11.120910 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120494 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"7795adbedbda671e6c0bf6c4e431d2b13872ce7e43965784ee5b87fee3f90a30"} Apr 22 18:40:11.120910 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.120506 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"4944fb20-7fca-449f-8e4d-85aed430bac0","Type":"ContainerStarted","Data":"adfe61c9e3978904bcd7ad0b8e4ab0e38bfa24ff21438efb6a1488f19900b714"} Apr 22 18:40:11.150422 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.150372 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.150356999 podStartE2EDuration="2.150356999s" podCreationTimestamp="2026-04-22 18:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:11.147910876 +0000 UTC m=+249.469679277" watchObservedRunningTime="2026-04-22 18:40:11.150356999 +0000 UTC m=+249.472125445" Apr 22 18:40:11.517667 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.517630 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-69bd9b55df-dj8wt"] Apr 22 18:40:11.521166 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.521147 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.523890 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.523861 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 18:40:11.523890 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.523879 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-sxgsd\"" Apr 22 18:40:11.524055 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.524027 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 18:40:11.524144 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.524123 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 18:40:11.524205 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.524131 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 18:40:11.524205 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.524170 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 18:40:11.530439 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.530422 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 18:40:11.532959 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.532941 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69bd9b55df-dj8wt"] Apr 22 18:40:11.564583 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564557 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-serving-certs-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564668 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564724 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564717 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564757 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-federate-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-metrics-client-ca\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564853 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564815 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.564853 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.564834 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgnl\" (UniqueName: \"kubernetes.io/projected/d88c4973-43ea-4185-bfdc-20f723bdc89b-kube-api-access-gkgnl\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665333 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665310 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgnl\" (UniqueName: \"kubernetes.io/projected/d88c4973-43ea-4185-bfdc-20f723bdc89b-kube-api-access-gkgnl\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665364 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-serving-certs-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665382 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665540 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665450 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665540 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665642 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665557 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-federate-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.665642 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.665578 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-metrics-client-ca\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.666186 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.666158 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-serving-certs-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.666346 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.666323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-metrics-client-ca\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.666424 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.666330 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.667919 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.667900 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-telemeter-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.668000 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.667969 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.668239 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.668224 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-federate-client-tls\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.668323 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.668303 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d88c4973-43ea-4185-bfdc-20f723bdc89b-secret-telemeter-client\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.674035 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.674012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgnl\" (UniqueName: \"kubernetes.io/projected/d88c4973-43ea-4185-bfdc-20f723bdc89b-kube-api-access-gkgnl\") pod \"telemeter-client-69bd9b55df-dj8wt\" (UID: \"d88c4973-43ea-4185-bfdc-20f723bdc89b\") " pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.791673 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.791616 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:11.792042 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792005 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="prometheus" containerID="cri-o://213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" gracePeriod=600 Apr 22 18:40:11.792173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792049 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-web" containerID="cri-o://d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" gracePeriod=600 Apr 22 18:40:11.792173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792025 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy" containerID="cri-o://749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" gracePeriod=600 Apr 22 18:40:11.792173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792078 2571 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-thanos" containerID="cri-o://f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" gracePeriod=600 Apr 22 18:40:11.792173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792103 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="config-reloader" containerID="cri-o://03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" gracePeriod=600 Apr 22 18:40:11.792344 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.792025 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="thanos-sidecar" containerID="cri-o://fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" gracePeriod=600 Apr 22 18:40:11.833311 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.833289 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" Apr 22 18:40:11.980030 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:11.980005 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69bd9b55df-dj8wt"] Apr 22 18:40:11.983297 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:40:11.983268 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd88c4973_43ea_4185_bfdc_20f723bdc89b.slice/crio-6ad3c1dbc90a1051fe11319206fb9e83274ddcc75dbc93c31322562d9a01e375 WatchSource:0}: Error finding container 6ad3c1dbc90a1051fe11319206fb9e83274ddcc75dbc93c31322562d9a01e375: Status 404 returned error can't find the container with id 6ad3c1dbc90a1051fe11319206fb9e83274ddcc75dbc93c31322562d9a01e375 Apr 22 18:40:12.036556 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.036535 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.068065 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068000 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068065 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068045 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068229 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068081 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068229 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068152 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068229 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068178 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068231 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068266 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068299 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwpmx\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068325 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:40:12.068365 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068404 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068430 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068479 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068504 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068541 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068563 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068591 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068638 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\" (UID: \"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8\") " Apr 22 
18:40:12.068839 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068794 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:12.069503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.068917 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.069503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.069266 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:12.070427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.070360 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:12.070427 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.070409 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:12.070571 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.070467 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:12.071196 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.071154 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:40:12.072578 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.072551 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out" (OuterVolumeSpecName: "config-out") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:40:12.072961 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.072937 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx" (OuterVolumeSpecName: "kube-api-access-zwpmx") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "kube-api-access-zwpmx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:12.073049 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.073033 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.073462 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.073432 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.073665 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.073639 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:40:12.073733 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.073661 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.074468 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.074436 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.075179 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.075139 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.075862 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.075692 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config" (OuterVolumeSpecName: "config") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.076254 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.076227 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.076663 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.076641 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.086206 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.086186 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config" (OuterVolumeSpecName: "web-config") pod "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" (UID: "93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:40:12.131053 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131024 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" exitCode=0 Apr 22 18:40:12.131053 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131049 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" exitCode=0 Apr 22 18:40:12.131053 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131058 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" exitCode=0 Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131066 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" exitCode=0 Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131074 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" exitCode=0 Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131080 2571 generic.go:358] "Generic (PLEG): container finished" podID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" exitCode=0 Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131120 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131168 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131176 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131201 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131215 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131230 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131235 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.131553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.131245 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8","Type":"ContainerDied","Data":"ccae2022e83a7e5b39b6009a46837aa8dba67129b807e31d2df3a6307c489d0a"} Apr 22 18:40:12.132496 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.132475 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" event={"ID":"d88c4973-43ea-4185-bfdc-20f723bdc89b","Type":"ContainerStarted","Data":"6ad3c1dbc90a1051fe11319206fb9e83274ddcc75dbc93c31322562d9a01e375"} Apr 22 18:40:12.138760 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.138741 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.147291 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.147274 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.153938 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.153912 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.156034 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.156015 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:12.160281 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.160255 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:12.161488 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.161473 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.167613 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.167598 2571 scope.go:117] "RemoveContainer" 
containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170131 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170157 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-web-config\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170178 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-metrics-client-ca\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170200 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170217 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170242 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170257 2571 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-kube-rbac-proxy\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170271 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-db\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170284 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config-out\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170299 2571 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-config\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170312 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170343 2571 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170365 2571 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-metrics-client-certs\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170382 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwpmx\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-kube-api-access-zwpmx\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170397 2571 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-grpc-tls\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170412 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-tls-assets\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.170569 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.170425 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:40:12.174451 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.174436 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.180435 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.180419 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.180676 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.180659 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.180722 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.180689 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 
not found: ID does not exist" Apr 22 18:40:12.180722 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.180708 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.180905 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.180887 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.180969 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.180913 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.180969 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.180934 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.181153 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.181137 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.181209 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181158 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.181209 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181177 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.181404 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.181386 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.181443 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181408 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed 
to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.181443 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181435 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.181633 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.181617 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.181675 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181636 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.181675 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181650 2571 scope.go:117] "RemoveContainer" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.181858 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.181843 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.181913 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181864 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.181913 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.181883 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.182114 ip-10-0-131-85 kubenswrapper[2571]: E0422 18:40:12.182081 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" 
containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.182173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182118 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.182173 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182130 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.182344 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182325 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" Apr 22 18:40:12.182387 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182345 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.182558 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182539 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.182603 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182559 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.182774 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182756 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.182838 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.182776 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.183032 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183016 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.183075 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183033 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.183272 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183255 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.183355 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183273 2571 scope.go:117] "RemoveContainer" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.183468 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183450 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.183532 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183470 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.183711 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183688 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.183711 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183710 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.183961 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183935 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container 
\"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" Apr 22 18:40:12.184135 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.183969 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.184289 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184261 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.184379 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184293 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.184551 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184528 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.184636 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184552 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.184902 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184875 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.184902 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.184900 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.185200 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185183 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.185200 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185200 2571 scope.go:117] "RemoveContainer" 
containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.185423 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185407 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.185475 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185424 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.185648 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185629 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.185715 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185650 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.185756 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185712 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:12.185889 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185873 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" Apr 22 18:40:12.185942 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.185889 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.186082 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186066 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="prometheus" Apr 22 18:40:12.186082 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186103 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="prometheus" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186118 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186126 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" 
containerName="kube-rbac-proxy" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186127 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186142 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="config-reloader" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186146 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186150 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="config-reloader" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186161 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-web" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186169 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-web" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186181 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-thanos" Apr 22 18:40:12.186188 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186189 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-thanos" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186199 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="init-config-reloader" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186209 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="init-config-reloader" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186216 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="thanos-sidecar" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186221 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="thanos-sidecar" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186295 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="config-reloader" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186308 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-thanos" 
Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186318 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186325 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="kube-rbac-proxy-web" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186332 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="prometheus" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186340 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" containerName="thanos-sidecar" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186353 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.186570 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186367 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.186956 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186578 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.186956 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186593 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.186956 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186808 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.186956 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.186828 2571 scope.go:117] "RemoveContainer" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.187077 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187014 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} 
err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.187077 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187027 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.187246 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187230 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.187288 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187247 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.187434 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187417 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" Apr 22 18:40:12.187483 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187434 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.187646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187631 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.187646 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187645 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.187859 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187843 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with 
d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.187908 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.187859 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.188067 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188046 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.188156 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188068 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.188331 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188312 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.188414 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188332 2571 scope.go:117] "RemoveContainer" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.188554 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188539 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.188592 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188555 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.188761 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188735 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.188808 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188764 2571 scope.go:117] "RemoveContainer" containerID="f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83" Apr 22 18:40:12.188987 
ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188970 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83"} err="failed to get container status \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": rpc error: code = NotFound desc = could not find container \"f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83\": container with ID starting with f00c9036c405e4176f21907509cf4c765a44888c3aaf6277279bc93b8a452f83 not found: ID does not exist" Apr 22 18:40:12.189029 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.188987 2571 scope.go:117] "RemoveContainer" containerID="749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c" Apr 22 18:40:12.189237 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189217 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c"} err="failed to get container status \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": rpc error: code = NotFound desc = could not find container \"749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c\": container with ID starting with 749c296815e5b09ad21af4f31f51c3fc5e3063c88bd8cf57ecb18a8bfab2d27c not found: ID does not exist" Apr 22 18:40:12.189291 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189237 2571 scope.go:117] "RemoveContainer" containerID="d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0" Apr 22 18:40:12.189464 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189431 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0"} err="failed to get container status \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": rpc error: code = NotFound desc = could not find container \"d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0\": container with ID starting with d542bc6bfb023fabf480120574bfd2bcc139335aca4f37e0489662422d7316d0 not found: ID does not exist" Apr 22 18:40:12.189464 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189448 2571 scope.go:117] "RemoveContainer" containerID="fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e" Apr 22 18:40:12.189652 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189637 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e"} err="failed to get container status \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": rpc error: code = NotFound desc = could not find container \"fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e\": container with ID starting with fcbbd9a54ffefa31185150ceb069efe9cb9e2669bf14d58b43e728fce6ba793e not found: ID does not exist" Apr 22 18:40:12.189694 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189652 2571 scope.go:117] "RemoveContainer" containerID="03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3" Apr 22 18:40:12.189905 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189876 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3"} err="failed to get container status 
\"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": rpc error: code = NotFound desc = could not find container \"03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3\": container with ID starting with 03588ff049090d56ee2b28b47dfd15f367b3b0cd42b9fe616b3298d095e4fbc3 not found: ID does not exist" Apr 22 18:40:12.189905 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.189903 2571 scope.go:117] "RemoveContainer" containerID="213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9" Apr 22 18:40:12.190133 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.190116 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9"} err="failed to get container status \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": rpc error: code = NotFound desc = could not find container \"213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9\": container with ID starting with 213c405b3cbbc2062bcabaf3acf486cecb90e927de3e86b374a11e64bbf050b9 not found: ID does not exist" Apr 22 18:40:12.190199 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.190134 2571 scope.go:117] "RemoveContainer" containerID="437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933" Apr 22 18:40:12.190329 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.190311 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933"} err="failed to get container status \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": rpc error: code = NotFound desc = could not find container \"437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933\": container with ID starting with 437999050f9d3224d0d3f7bfd36ca75a521ce6c26174799b6f89bec6c4b87933 not found: ID does not exist" Apr 22 18:40:12.191677 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.191662 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.194484 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.194463 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 22 18:40:12.194615 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.194505 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 22 18:40:12.194615 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.194565 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 22 18:40:12.194716 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.194618 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 22 18:40:12.194798 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.194730 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 22 18:40:12.195117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195099 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-ztvmr\"" Apr 22 18:40:12.195211 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195173 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-1j05c8btsius7\"" Apr 22 18:40:12.195281 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195254 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 22 18:40:12.195347 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195311 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 22 18:40:12.195421 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195405 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 22 18:40:12.195718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195698 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 22 18:40:12.195794 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.195771 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 22 18:40:12.197718 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.197629 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 22 18:40:12.199482 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.199463 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 22 18:40:12.203133 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.203113 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:12.254852 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.254825 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8" 
path="/var/lib/kubelet/pods/93755d7c-1f4f-4ee2-b1c8-f17dc4cc4cf8/volumes" Apr 22 18:40:12.271438 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271415 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271545 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271448 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271545 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271545 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271582 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271630 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271780 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271639 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-config\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271780 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271674 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271780 ip-10-0-131-85 
kubenswrapper[2571]: I0422 18:40:12.271714 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-web-config\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271780 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271875 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.271953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271914 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbjhj\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-kube-api-access-jbjhj\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.272117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271952 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-config-out\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.272117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271972 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.272117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.271988 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.272117 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.272010 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372636 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372567 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbjhj\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-kube-api-access-jbjhj\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372636 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372607 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-config-out\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372629 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372658 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372682 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372713 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372760 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.372797 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372808 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372834 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-config\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372909 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372943 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-web-config\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.372983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.373027 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.373057 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.373119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.373226 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.373135 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.374349 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.374320 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.375978 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.375947 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376067 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.375992 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376067 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.375999 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376628 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376417 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376628 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376628 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376515 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376827 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d294972f-269b-49b1-9403-a420af0aac12-config-out\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376827 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376764 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d294972f-269b-49b1-9403-a420af0aac12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.376937 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.376913 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.377133 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.377076 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.377598 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.377577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.377967 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.377950 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.378424 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.378399 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-web-config\") pod \"prometheus-k8s-0\" (UID: 
\"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.378744 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.378725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.378837 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.378754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d294972f-269b-49b1-9403-a420af0aac12-config\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.382372 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.382352 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbjhj\" (UniqueName: \"kubernetes.io/projected/d294972f-269b-49b1-9403-a420af0aac12-kube-api-access-jbjhj\") pod \"prometheus-k8s-0\" (UID: \"d294972f-269b-49b1-9403-a420af0aac12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.501500 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.501474 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:40:12.635733 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.635646 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 22 18:40:12.639193 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:40:12.639162 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd294972f_269b_49b1_9403_a420af0aac12.slice/crio-c72f09277a4774ba0bd7538b62e208a6d0c8422fe31cf82d28300ec0255d437a WatchSource:0}: Error finding container c72f09277a4774ba0bd7538b62e208a6d0c8422fe31cf82d28300ec0255d437a: Status 404 returned error can't find the container with id c72f09277a4774ba0bd7538b62e208a6d0c8422fe31cf82d28300ec0255d437a Apr 22 18:40:12.978279 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.978206 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:40:12.980528 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:12.980501 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f01e5c64-eadd-49f5-a2f4-4953111daa69-metrics-certs\") pod \"network-metrics-daemon-vjdzq\" (UID: \"f01e5c64-eadd-49f5-a2f4-4953111daa69\") " pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:40:13.137301 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.137267 2571 generic.go:358] "Generic (PLEG): container finished" podID="d294972f-269b-49b1-9403-a420af0aac12" containerID="a44adad197f2a0ab318a0657cf47f619173d34eab2a01b947e621c01c081017d" exitCode=0 Apr 22 18:40:13.137736 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.137329 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerDied","Data":"a44adad197f2a0ab318a0657cf47f619173d34eab2a01b947e621c01c081017d"} Apr 22 18:40:13.137736 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.137350 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"c72f09277a4774ba0bd7538b62e208a6d0c8422fe31cf82d28300ec0255d437a"} Apr 22 18:40:13.153053 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.153032 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dfh6w\"" Apr 22 18:40:13.160861 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.160841 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vjdzq" Apr 22 18:40:13.304637 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:13.304590 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vjdzq"] Apr 22 18:40:13.307296 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:40:13.307264 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf01e5c64_eadd_49f5_a2f4_4953111daa69.slice/crio-5a952a2e9036a5f509d6e67e265a970e3927ad196efa83b21626a91b769f80ea WatchSource:0}: Error finding container 5a952a2e9036a5f509d6e67e265a970e3927ad196efa83b21626a91b769f80ea: Status 404 returned error can't find the container with id 5a952a2e9036a5f509d6e67e265a970e3927ad196efa83b21626a91b769f80ea Apr 22 18:40:14.143070 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.143035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" event={"ID":"d88c4973-43ea-4185-bfdc-20f723bdc89b","Type":"ContainerStarted","Data":"4cc076772c2bece512f580897cd6ff6107ef5e39b7bbb35d37b6e523f4bac648"} Apr 22 18:40:14.143419 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.143082 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" event={"ID":"d88c4973-43ea-4185-bfdc-20f723bdc89b","Type":"ContainerStarted","Data":"79f36e6e5cce046ef664c606dd04b0eb1b74bd2ff0ced7177140c1865c3211be"} Apr 22 18:40:14.146331 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146303 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"827020a6b1db66e00e587c9f6d5190a9e9886e1bfae5f2dbdfbd4e2d359c382d"} Apr 22 18:40:14.146461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146342 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"2383e6259dadfc7a6482c3c086abf092144bad8e53b32935375d41d2ece0c5ee"} Apr 22 18:40:14.146461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146358 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"d1f7e53664b1581dd498bed23530c1c0965455300922f0a9ecfc6991a2fd5e83"} Apr 22 18:40:14.146461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146371 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"3354253966dc940254bbf97abf8253d6e001693cd5fd179c62c0ac53490e390f"} Apr 22 18:40:14.146461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146387 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"26b7f200661d782428507784bc378eaa6ff6c938e793d8018b0868ec202dbb54"} Apr 22 18:40:14.146461 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.146401 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d294972f-269b-49b1-9403-a420af0aac12","Type":"ContainerStarted","Data":"161c7f10af9b62b27717a5d53c9fbdce6565cb0e523c11261c7deac5b633db8e"} Apr 22 18:40:14.147978 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.147950 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjdzq" event={"ID":"f01e5c64-eadd-49f5-a2f4-4953111daa69","Type":"ContainerStarted","Data":"5a952a2e9036a5f509d6e67e265a970e3927ad196efa83b21626a91b769f80ea"} Apr 22 18:40:14.176397 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:14.176337 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.176316871 podStartE2EDuration="2.176316871s" podCreationTimestamp="2026-04-22 18:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:40:14.173129774 +0000 UTC m=+252.494898174" watchObservedRunningTime="2026-04-22 18:40:14.176316871 +0000 UTC m=+252.498085269" Apr 22 18:40:15.152973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:15.152934 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjdzq" event={"ID":"f01e5c64-eadd-49f5-a2f4-4953111daa69","Type":"ContainerStarted","Data":"9da39f59d632be304d4ade6f1f7f33c8887b5046ff69af5778c2b00db384d224"} Apr 22 18:40:15.152973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:15.152971 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vjdzq" event={"ID":"f01e5c64-eadd-49f5-a2f4-4953111daa69","Type":"ContainerStarted","Data":"cdf1bf2ceaa77e5fa5152bf2148568dcaba6d3419e131a470726ca8ff357926f"} Apr 22 18:40:15.154653 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:15.154628 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" event={"ID":"d88c4973-43ea-4185-bfdc-20f723bdc89b","Type":"ContainerStarted","Data":"495b6daafcd36013cb37f4f74956cd23d4b72c657d2561b673f59e18364a9567"} Apr 22 18:40:15.170953 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:15.170907 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vjdzq" podStartSLOduration=251.856929638 podStartE2EDuration="4m13.170892955s" podCreationTimestamp="2026-04-22 18:36:02 +0000 UTC" firstStartedPulling="2026-04-22 18:40:13.309587769 +0000 UTC m=+251.631356160" lastFinishedPulling="2026-04-22 18:40:14.623551087 +0000 UTC m=+252.945319477" observedRunningTime="2026-04-22 18:40:15.168634444 +0000 UTC m=+253.490402842" watchObservedRunningTime="2026-04-22 18:40:15.170892955 +0000 UTC m=+253.492661352" Apr 22 18:40:15.191808 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:15.191756 2571 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-monitoring/telemeter-client-69bd9b55df-dj8wt" podStartSLOduration=2.172244043 podStartE2EDuration="4.191739014s" podCreationTimestamp="2026-04-22 18:40:11 +0000 UTC" firstStartedPulling="2026-04-22 18:40:11.985274138 +0000 UTC m=+250.307042514" lastFinishedPulling="2026-04-22 18:40:14.004769104 +0000 UTC m=+252.326537485" observedRunningTime="2026-04-22 18:40:15.190948639 +0000 UTC m=+253.512717038" watchObservedRunningTime="2026-04-22 18:40:15.191739014 +0000 UTC m=+253.513507413" Apr 22 18:40:17.502026 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:40:17.501990 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:02.129835 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:02.129806 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:41:02.132006 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:02.131981 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:41:02.134234 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:02.134212 2571 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 18:41:12.501746 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:12.501711 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:12.516473 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:12.516453 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:41:13.339663 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:41:13.339635 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 22 18:45:29.852741 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.852703 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-sv5jm"] Apr 22 18:45:29.855871 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.855852 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sv5jm" Apr 22 18:45:29.858465 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.858434 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:45:29.858604 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.858446 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:45:29.859364 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.859348 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:45:29.859431 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.859351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-8mrsg\"" Apr 22 18:45:29.863791 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.863772 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sv5jm"] Apr 22 18:45:29.877189 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.877169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8nqv\" (UniqueName: \"kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv\") pod \"s3-init-sv5jm\" (UID: \"595d5550-5548-4c8d-b5af-b4a454956d51\") " pod="kserve/s3-init-sv5jm" Apr 22 18:45:29.978553 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.978510 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8nqv\" (UniqueName: \"kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv\") pod \"s3-init-sv5jm\" (UID: \"595d5550-5548-4c8d-b5af-b4a454956d51\") " pod="kserve/s3-init-sv5jm" Apr 22 18:45:29.986836 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:29.986804 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8nqv\" (UniqueName: \"kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv\") pod \"s3-init-sv5jm\" (UID: \"595d5550-5548-4c8d-b5af-b4a454956d51\") " pod="kserve/s3-init-sv5jm" Apr 22 18:45:30.179387 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:30.179314 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sv5jm" Apr 22 18:45:30.297561 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:30.297532 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-sv5jm"] Apr 22 18:45:30.300525 ip-10-0-131-85 kubenswrapper[2571]: W0422 18:45:30.300501 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod595d5550_5548_4c8d_b5af_b4a454956d51.slice/crio-9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d WatchSource:0}: Error finding container 9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d: Status 404 returned error can't find the container with id 9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d Apr 22 18:45:30.302229 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:30.302212 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:45:31.076764 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:31.076719 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv5jm" event={"ID":"595d5550-5548-4c8d-b5af-b4a454956d51","Type":"ContainerStarted","Data":"9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d"} Apr 22 18:45:35.090651 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:35.090609 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv5jm" event={"ID":"595d5550-5548-4c8d-b5af-b4a454956d51","Type":"ContainerStarted","Data":"797631d51eaf6de6aa195c7f0c32ee21a7c9117ec70cfc1c8ade492de1c95aaa"} Apr 22 18:45:35.106992 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:35.106940 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-sv5jm" podStartSLOduration=1.659741947 podStartE2EDuration="6.106924658s" podCreationTimestamp="2026-04-22 18:45:29 +0000 UTC" firstStartedPulling="2026-04-22 18:45:30.302336976 +0000 UTC m=+568.624105356" lastFinishedPulling="2026-04-22 18:45:34.749519692 +0000 UTC m=+573.071288067" observedRunningTime="2026-04-22 18:45:35.105856821 +0000 UTC m=+573.427625212" watchObservedRunningTime="2026-04-22 18:45:35.106924658 +0000 UTC m=+573.428693058" Apr 22 18:45:38.100344 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:38.100308 2571 generic.go:358] "Generic (PLEG): container finished" podID="595d5550-5548-4c8d-b5af-b4a454956d51" containerID="797631d51eaf6de6aa195c7f0c32ee21a7c9117ec70cfc1c8ade492de1c95aaa" exitCode=0 Apr 22 18:45:38.100749 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:38.100388 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv5jm" event={"ID":"595d5550-5548-4c8d-b5af-b4a454956d51","Type":"ContainerDied","Data":"797631d51eaf6de6aa195c7f0c32ee21a7c9117ec70cfc1c8ade492de1c95aaa"} Apr 22 18:45:39.229253 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:39.229227 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-sv5jm" Apr 22 18:45:39.259540 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:39.259514 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8nqv\" (UniqueName: \"kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv\") pod \"595d5550-5548-4c8d-b5af-b4a454956d51\" (UID: \"595d5550-5548-4c8d-b5af-b4a454956d51\") " Apr 22 18:45:39.261579 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:39.261550 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv" (OuterVolumeSpecName: "kube-api-access-c8nqv") pod "595d5550-5548-4c8d-b5af-b4a454956d51" (UID: "595d5550-5548-4c8d-b5af-b4a454956d51"). InnerVolumeSpecName "kube-api-access-c8nqv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:45:39.360700 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:39.360619 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8nqv\" (UniqueName: \"kubernetes.io/projected/595d5550-5548-4c8d-b5af-b4a454956d51-kube-api-access-c8nqv\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 18:45:40.109617 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:40.109572 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-sv5jm" event={"ID":"595d5550-5548-4c8d-b5af-b4a454956d51","Type":"ContainerDied","Data":"9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d"} Apr 22 18:45:40.109617 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:40.109603 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-sv5jm" Apr 22 18:45:40.109617 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:45:40.109613 2571 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a24bcc4c03c189b14146e3e4f5ff9dd1e1850e16a5c7d86d012fb38dca97e0d" Apr 22 18:46:02.157836 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:46:02.157805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:46:02.158389 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:46:02.157805 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:51:02.185973 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:51:02.185903 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:51:02.186632 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:51:02.186607 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:56:02.208503 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:56:02.208475 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 18:56:02.210918 ip-10-0-131-85 kubenswrapper[2571]: I0422 18:56:02.209072 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:01:02.238447 ip-10-0-131-85 kubenswrapper[2571]: I0422 
19:01:02.238330 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:01:02.242573 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:01:02.238719 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:06:02.270215 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:06:02.270188 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:06:02.272402 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:06:02.271717 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:11:02.292469 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:11:02.292343 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:11:02.296346 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:11:02.294242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:16:02.317786 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:16:02.317686 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:16:02.321979 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:16:02.320940 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:21:02.340899 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:21:02.340787 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:21:02.349112 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:21:02.349077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:26:02.367637 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:26:02.367508 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:26:02.376667 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:26:02.376648 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:31:02.395965 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:31:02.395850 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:31:02.405760 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:31:02.405733 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:36:02.418184 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:36:02.418057 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:36:02.429786 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:36:02.429766 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:41:02.439982 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:41:02.439862 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:41:02.453173 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:41:02.453149 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:45:52.951187 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.951154 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m7lb5/must-gather-8hzbs"] Apr 22 19:45:52.951666 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.951494 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="595d5550-5548-4c8d-b5af-b4a454956d51" containerName="s3-init" Apr 22 19:45:52.951666 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.951505 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="595d5550-5548-4c8d-b5af-b4a454956d51" containerName="s3-init" Apr 22 19:45:52.951666 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.951562 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="595d5550-5548-4c8d-b5af-b4a454956d51" containerName="s3-init" Apr 22 19:45:52.954645 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.954630 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:52.957127 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.957078 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m7lb5\"/\"openshift-service-ca.crt\"" Apr 22 19:45:52.957127 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.957078 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-m7lb5\"/\"default-dockercfg-8knz8\"" Apr 22 19:45:52.957303 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.957172 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m7lb5\"/\"kube-root-ca.crt\"" Apr 22 19:45:52.969833 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:52.969806 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m7lb5/must-gather-8hzbs"] Apr 22 19:45:53.082848 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.082816 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.083032 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.082866 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgxcc\" (UniqueName: \"kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.184345 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.184303 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.184515 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.184359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tgxcc\" (UniqueName: \"kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.184677 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.184656 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.191941 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.191918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgxcc\" (UniqueName: \"kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc\") pod \"must-gather-8hzbs\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.272844 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.272812 2571 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:45:53.394929 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.394905 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m7lb5/must-gather-8hzbs"] Apr 22 19:45:53.397438 ip-10-0-131-85 kubenswrapper[2571]: W0422 19:45:53.397406 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod890aa654_ffc3_4b25_bbcb_22f439774551.slice/crio-0fd2fa5900f09d90ac432d132dedb071a4ae80024b692ee346cfe6c4c22ffc51 WatchSource:0}: Error finding container 0fd2fa5900f09d90ac432d132dedb071a4ae80024b692ee346cfe6c4c22ffc51: Status 404 returned error can't find the container with id 0fd2fa5900f09d90ac432d132dedb071a4ae80024b692ee346cfe6c4c22ffc51 Apr 22 19:45:53.399451 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.399435 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:45:53.637865 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:53.637783 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" event={"ID":"890aa654-ffc3-4b25-bbcb-22f439774551","Type":"ContainerStarted","Data":"0fd2fa5900f09d90ac432d132dedb071a4ae80024b692ee346cfe6c4c22ffc51"} Apr 22 19:45:58.657389 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:58.657353 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" event={"ID":"890aa654-ffc3-4b25-bbcb-22f439774551","Type":"ContainerStarted","Data":"d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6"} Apr 22 19:45:58.657389 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:58.657389 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" event={"ID":"890aa654-ffc3-4b25-bbcb-22f439774551","Type":"ContainerStarted","Data":"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed"} Apr 22 19:45:58.673979 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:45:58.673918 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" podStartSLOduration=2.4744180780000002 podStartE2EDuration="6.673898132s" podCreationTimestamp="2026-04-22 19:45:52 +0000 UTC" firstStartedPulling="2026-04-22 19:45:53.399562969 +0000 UTC m=+4191.721331345" lastFinishedPulling="2026-04-22 19:45:57.599043022 +0000 UTC m=+4195.920811399" observedRunningTime="2026-04-22 19:45:58.672201957 +0000 UTC m=+4196.993970357" watchObservedRunningTime="2026-04-22 19:45:58.673898132 +0000 UTC m=+4196.995666528" Apr 22 19:46:02.477175 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:02.477045 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:46:02.484247 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:02.484230 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:46:22.736393 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:22.736359 2571 generic.go:358] "Generic (PLEG): container finished" podID="890aa654-ffc3-4b25-bbcb-22f439774551" containerID="df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed" exitCode=0 Apr 22 19:46:22.736797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:22.736434 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" event={"ID":"890aa654-ffc3-4b25-bbcb-22f439774551","Type":"ContainerDied","Data":"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed"} Apr 22 19:46:22.736797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:22.736779 2571 scope.go:117] "RemoveContainer" containerID="df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed" Apr 22 19:46:23.029818 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.029785 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7lb5_must-gather-8hzbs_890aa654-ffc3-4b25-bbcb-22f439774551/gather/0.log" Apr 22 19:46:23.517195 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.517154 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5hs9k/must-gather-ssfrv"] Apr 22 19:46:23.520712 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.520687 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.523042 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.523015 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"openshift-service-ca.crt\"" Apr 22 19:46:23.523153 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.523072 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5hs9k\"/\"default-dockercfg-9t4h8\"" Apr 22 19:46:23.523153 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.523081 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5hs9k\"/\"kube-root-ca.crt\"" Apr 22 19:46:23.527027 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.527004 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/must-gather-ssfrv"] Apr 22 19:46:23.554213 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.554174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z28wj\" (UniqueName: \"kubernetes.io/projected/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-kube-api-access-z28wj\") pod \"must-gather-ssfrv\" (UID: \"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.554397 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.554244 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-must-gather-output\") pod \"must-gather-ssfrv\" (UID: \"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.655684 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.655625 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z28wj\" (UniqueName: \"kubernetes.io/projected/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-kube-api-access-z28wj\") pod \"must-gather-ssfrv\" (UID: \"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.655866 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.655725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-must-gather-output\") pod \"must-gather-ssfrv\" (UID: 
\"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.656114 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.656069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-must-gather-output\") pod \"must-gather-ssfrv\" (UID: \"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.663134 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.663114 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z28wj\" (UniqueName: \"kubernetes.io/projected/cdfaea1d-5c8e-427f-9d82-44f368fb7a86-kube-api-access-z28wj\") pod \"must-gather-ssfrv\" (UID: \"cdfaea1d-5c8e-427f-9d82-44f368fb7a86\") " pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.830773 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.830689 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" Apr 22 19:46:23.952483 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:23.952452 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/must-gather-ssfrv"] Apr 22 19:46:23.955937 ip-10-0-131-85 kubenswrapper[2571]: W0422 19:46:23.955908 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdfaea1d_5c8e_427f_9d82_44f368fb7a86.slice/crio-097e70a1c880ab850a6731305798196cec3a2e628f2782bad74cfbb03a8d8a98 WatchSource:0}: Error finding container 097e70a1c880ab850a6731305798196cec3a2e628f2782bad74cfbb03a8d8a98: Status 404 returned error can't find the container with id 097e70a1c880ab850a6731305798196cec3a2e628f2782bad74cfbb03a8d8a98 Apr 22 19:46:24.743780 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:24.743737 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" event={"ID":"cdfaea1d-5c8e-427f-9d82-44f368fb7a86","Type":"ContainerStarted","Data":"097e70a1c880ab850a6731305798196cec3a2e628f2782bad74cfbb03a8d8a98"} Apr 22 19:46:25.748802 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:25.748515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" event={"ID":"cdfaea1d-5c8e-427f-9d82-44f368fb7a86","Type":"ContainerStarted","Data":"9a13ab771693f1dfecf66b05f583a83e1da80a119863e4c08fb491e653a42a83"} Apr 22 19:46:25.748802 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:25.748558 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" event={"ID":"cdfaea1d-5c8e-427f-9d82-44f368fb7a86","Type":"ContainerStarted","Data":"75307345f6cae18bcec5524d39e9a5f8162ab87e597af514bc2987ca0ffaef11"} Apr 22 19:46:25.765619 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:25.765562 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5hs9k/must-gather-ssfrv" podStartSLOduration=1.846222971 podStartE2EDuration="2.765541964s" podCreationTimestamp="2026-04-22 19:46:23 +0000 UTC" firstStartedPulling="2026-04-22 19:46:23.957580598 +0000 UTC m=+4222.279348976" lastFinishedPulling="2026-04-22 19:46:24.876899586 +0000 UTC m=+4223.198667969" observedRunningTime="2026-04-22 19:46:25.764077146 +0000 UTC m=+4224.085845546" watchObservedRunningTime="2026-04-22 19:46:25.765541964 +0000 UTC m=+4224.087310363" Apr 22 
19:46:26.316981 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:26.316941 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-n59sp_4e7619c0-c870-4eac-9372-f539d9128cc8/global-pull-secret-syncer/0.log" Apr 22 19:46:26.367167 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:26.367116 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hcck9_34348a50-b524-4eb0-80d8-2866eaf0b1aa/konnectivity-agent/0.log" Apr 22 19:46:26.498420 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:26.498393 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-85.ec2.internal_cfa5d89adc8adc45f4b9cd558035eee4/haproxy/0.log" Apr 22 19:46:28.371321 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.371270 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m7lb5/must-gather-8hzbs"] Apr 22 19:46:28.374136 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.374102 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="copy" containerID="cri-o://d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6" gracePeriod=2 Apr 22 19:46:28.376356 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.376298 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m7lb5/must-gather-8hzbs"] Apr 22 19:46:28.378329 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.378291 2571 status_manager.go:895] "Failed to get status for pod" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" err="pods \"must-gather-8hzbs\" is forbidden: User \"system:node:ip-10-0-131-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7lb5\": no relationship found between node 'ip-10-0-131-85.ec2.internal' and this object" Apr 22 19:46:28.744865 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.742837 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7lb5_must-gather-8hzbs_890aa654-ffc3-4b25-bbcb-22f439774551/copy/0.log" Apr 22 19:46:28.744865 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.743277 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:46:28.747108 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.746410 2571 status_manager.go:895] "Failed to get status for pod" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" err="pods \"must-gather-8hzbs\" is forbidden: User \"system:node:ip-10-0-131-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7lb5\": no relationship found between node 'ip-10-0-131-85.ec2.internal' and this object" Apr 22 19:46:28.762108 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.761980 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7lb5_must-gather-8hzbs_890aa654-ffc3-4b25-bbcb-22f439774551/copy/0.log" Apr 22 19:46:28.763797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.762404 2571 generic.go:358] "Generic (PLEG): container finished" podID="890aa654-ffc3-4b25-bbcb-22f439774551" containerID="d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6" exitCode=143 Apr 22 19:46:28.763797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.762510 2571 scope.go:117] "RemoveContainer" containerID="d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6" Apr 22 19:46:28.763797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.762636 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" Apr 22 19:46:28.764861 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.764821 2571 status_manager.go:895] "Failed to get status for pod" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" err="pods \"must-gather-8hzbs\" is forbidden: User \"system:node:ip-10-0-131-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7lb5\": no relationship found between node 'ip-10-0-131-85.ec2.internal' and this object" Apr 22 19:46:28.777295 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.776243 2571 scope.go:117] "RemoveContainer" containerID="df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed" Apr 22 19:46:28.798732 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.796799 2571 scope.go:117] "RemoveContainer" containerID="d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6" Apr 22 19:46:28.798732 ip-10-0-131-85 kubenswrapper[2571]: E0422 19:46:28.797293 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6\": container with ID starting with d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6 not found: ID does not exist" containerID="d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6" Apr 22 19:46:28.798732 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.797333 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6"} err="failed to get container status \"d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6\": rpc error: code = NotFound desc = could not find container \"d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6\": container with ID starting with d74fbc933ae876f78fb72a145e1d0c0dc66a605d3ca69b9c1a211ba2d4c4b7c6 not found: ID does not exist" Apr 22 19:46:28.798732 ip-10-0-131-85 
kubenswrapper[2571]: I0422 19:46:28.797357 2571 scope.go:117] "RemoveContainer" containerID="df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed" Apr 22 19:46:28.798732 ip-10-0-131-85 kubenswrapper[2571]: E0422 19:46:28.797650 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed\": container with ID starting with df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed not found: ID does not exist" containerID="df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed" Apr 22 19:46:28.798732 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.797676 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed"} err="failed to get container status \"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed\": rpc error: code = NotFound desc = could not find container \"df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed\": container with ID starting with df32b9ad3c5aef0c66815bc74db48432dd6e44f13ba9bf3e36b2b40cb83019ed not found: ID does not exist" Apr 22 19:46:28.803075 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.802144 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output\") pod \"890aa654-ffc3-4b25-bbcb-22f439774551\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " Apr 22 19:46:28.803075 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.802286 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgxcc\" (UniqueName: \"kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc\") pod \"890aa654-ffc3-4b25-bbcb-22f439774551\" (UID: \"890aa654-ffc3-4b25-bbcb-22f439774551\") " Apr 22 19:46:28.804206 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.804175 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "890aa654-ffc3-4b25-bbcb-22f439774551" (UID: "890aa654-ffc3-4b25-bbcb-22f439774551"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:28.820324 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.819212 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc" (OuterVolumeSpecName: "kube-api-access-tgxcc") pod "890aa654-ffc3-4b25-bbcb-22f439774551" (UID: "890aa654-ffc3-4b25-bbcb-22f439774551"). InnerVolumeSpecName "kube-api-access-tgxcc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:46:28.903407 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.903362 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tgxcc\" (UniqueName: \"kubernetes.io/projected/890aa654-ffc3-4b25-bbcb-22f439774551-kube-api-access-tgxcc\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 19:46:28.903407 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:28.903413 2571 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/890aa654-ffc3-4b25-bbcb-22f439774551-must-gather-output\") on node \"ip-10-0-131-85.ec2.internal\" DevicePath \"\"" Apr 22 19:46:29.078793 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:29.078748 2571 status_manager.go:895] "Failed to get status for pod" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" pod="openshift-must-gather-m7lb5/must-gather-8hzbs" err="pods \"must-gather-8hzbs\" is forbidden: User \"system:node:ip-10-0-131-85.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7lb5\": no relationship found between node 'ip-10-0-131-85.ec2.internal' and this object" Apr 22 19:46:30.028518 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.028484 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/alertmanager/0.log" Apr 22 19:46:30.053107 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.053059 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/config-reloader/0.log" Apr 22 19:46:30.077200 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.077150 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/kube-rbac-proxy-web/0.log" Apr 22 19:46:30.100701 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.100603 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/kube-rbac-proxy/0.log" Apr 22 19:46:30.120902 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.120870 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/kube-rbac-proxy-metric/0.log" Apr 22 19:46:30.142109 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.142064 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/prom-label-proxy/0.log" Apr 22 19:46:30.167048 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.167014 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_4944fb20-7fca-449f-8e4d-85aed430bac0/init-config-reloader/0.log" Apr 22 19:46:30.209182 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.209146 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kzgfb_ae20567a-fdd6-4700-8205-d7122697fdbb/cluster-monitoring-operator/0.log" Apr 22 19:46:30.236590 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.236557 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-state-metrics/0.log" Apr 22 19:46:30.256008 ip-10-0-131-85 kubenswrapper[2571]: 
I0422 19:46:30.255970 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" path="/var/lib/kubelet/pods/890aa654-ffc3-4b25-bbcb-22f439774551/volumes" Apr 22 19:46:30.256952 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.256926 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-rbac-proxy-main/0.log" Apr 22 19:46:30.285682 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.285654 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-tr8d6_241a6dfc-3317-476a-87a7-c45566d85531/kube-rbac-proxy-self/0.log" Apr 22 19:46:30.316015 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.315987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-86bf4ff869-c8r6f_55ee5af3-880b-43e0-9488-372ee8e23ae3/metrics-server/0.log" Apr 22 19:46:30.339842 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.339811 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-cvth6_9ce92ab3-5791-4660-bb4e-4740f84eb7d5/monitoring-plugin/0.log" Apr 22 19:46:30.511331 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.511299 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/node-exporter/0.log" Apr 22 19:46:30.535077 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.535045 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/kube-rbac-proxy/0.log" Apr 22 19:46:30.556386 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.556357 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-s5kds_f7e2838f-4cf4-4061-b415-1736faeee81b/init-textfile/0.log" Apr 22 19:46:30.587351 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.587316 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/kube-rbac-proxy-main/0.log" Apr 22 19:46:30.608797 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.608767 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/kube-rbac-proxy-self/0.log" Apr 22 19:46:30.632935 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.632906 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-qw5j9_711d7979-e87b-47c3-9f49-79736b091f74/openshift-state-metrics/0.log" Apr 22 19:46:30.677028 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.676999 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/prometheus/0.log" Apr 22 19:46:30.694492 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.694458 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/config-reloader/0.log" Apr 22 19:46:30.722982 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.722953 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/thanos-sidecar/0.log" Apr 22 
19:46:30.742297 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.742266 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/kube-rbac-proxy-web/0.log" Apr 22 19:46:30.763275 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.763168 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/kube-rbac-proxy/0.log" Apr 22 19:46:30.783013 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.782979 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/kube-rbac-proxy-thanos/0.log" Apr 22 19:46:30.803105 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.803060 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_d294972f-269b-49b1-9403-a420af0aac12/init-config-reloader/0.log" Apr 22 19:46:30.908248 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.908201 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69bd9b55df-dj8wt_d88c4973-43ea-4185-bfdc-20f723bdc89b/telemeter-client/0.log" Apr 22 19:46:30.930566 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.930536 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69bd9b55df-dj8wt_d88c4973-43ea-4185-bfdc-20f723bdc89b/reload/0.log" Apr 22 19:46:30.950801 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:30.950729 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-69bd9b55df-dj8wt_d88c4973-43ea-4185-bfdc-20f723bdc89b/kube-rbac-proxy/0.log" Apr 22 19:46:33.037830 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.037786 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-t9zsh_6d29fb6d-5501-456e-8bb6-c5b983b92684/download-server/0.log" Apr 22 19:46:33.329812 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.329721 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h"] Apr 22 19:46:33.330528 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330498 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="copy" Apr 22 19:46:33.330528 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330528 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="copy" Apr 22 19:46:33.330700 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330548 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="gather" Apr 22 19:46:33.330700 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330561 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="gather" Apr 22 19:46:33.330700 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330664 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="gather" Apr 22 19:46:33.330700 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.330678 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="890aa654-ffc3-4b25-bbcb-22f439774551" containerName="copy" Apr 22 19:46:33.335707 ip-10-0-131-85 kubenswrapper[2571]: I0422 
19:46:33.335684 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.340737 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.340705 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h"] Apr 22 19:46:33.447572 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.447538 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-proc\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.447748 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.447589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-lib-modules\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.447748 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.447705 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-sys\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.447748 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.447740 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j9mg\" (UniqueName: \"kubernetes.io/projected/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-kube-api-access-2j9mg\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.447899 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.447817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-podres\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549308 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-podres\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549494 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549332 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-proc\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549494 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549373 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-lib-modules\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549494 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-sys\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549494 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549481 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2j9mg\" (UniqueName: \"kubernetes.io/projected/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-kube-api-access-2j9mg\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549494 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549483 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-proc\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549701 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549479 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-podres\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549701 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549525 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-sys\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.549701 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.549600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-lib-modules\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.558526 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.558481 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j9mg\" (UniqueName: \"kubernetes.io/projected/7b0cf4bf-651c-47db-abf7-2a723e49d2ad-kube-api-access-2j9mg\") pod \"perf-node-gather-daemonset-s9r5h\" (UID: \"7b0cf4bf-651c-47db-abf7-2a723e49d2ad\") " pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.651796 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.651722 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:33.802300 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:33.801423 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h"] Apr 22 19:46:34.161516 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.161476 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8lhzb_3dbfdfa2-adbc-427e-8859-26bcaa36a0a7/dns/0.log" Apr 22 19:46:34.193645 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.193616 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-8lhzb_3dbfdfa2-adbc-427e-8859-26bcaa36a0a7/kube-rbac-proxy/0.log" Apr 22 19:46:34.349952 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.349925 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-gxtfk_f75b87a2-8899-4b74-9e48-0ca63be22b47/dns-node-resolver/0.log" Apr 22 19:46:34.788450 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.788418 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" event={"ID":"7b0cf4bf-651c-47db-abf7-2a723e49d2ad","Type":"ContainerStarted","Data":"69a5a81ce2360e63ad4eab50489c61cb1ed0aa6cb4132c42e7634a5435f6e674"} Apr 22 19:46:34.788450 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.788453 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" event={"ID":"7b0cf4bf-651c-47db-abf7-2a723e49d2ad","Type":"ContainerStarted","Data":"57c2153956f9c1f8b4717cad1e80db210ca5fdb70b0ea774f201d0a92cd8c744"} Apr 22 19:46:34.788695 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.788603 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:34.806546 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.806495 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" podStartSLOduration=1.8064813119999998 podStartE2EDuration="1.806481312s" podCreationTimestamp="2026-04-22 19:46:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:34.804576987 +0000 UTC m=+4233.126345408" watchObservedRunningTime="2026-04-22 19:46:34.806481312 +0000 UTC m=+4233.128249759" Apr 22 19:46:34.831489 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:34.831465 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-djp9s_4717a966-da61-4170-b33d-9c683e74d3aa/node-ca/0.log" Apr 22 19:46:35.863313 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:35.863285 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jcxbc_8a27a543-b3c5-436c-8326-abb0c703e4d0/serve-healthcheck-canary/0.log" Apr 22 19:46:36.379420 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:36.379388 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j9zwr_8e21f1d0-e415-4dcb-b33c-327299da2218/kube-rbac-proxy/0.log" Apr 22 19:46:36.398552 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:36.398525 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j9zwr_8e21f1d0-e415-4dcb-b33c-327299da2218/exporter/0.log" 
Apr 22 19:46:36.417653 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:36.417628 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j9zwr_8e21f1d0-e415-4dcb-b33c-327299da2218/extractor/0.log" Apr 22 19:46:38.812550 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:38.812524 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-sv5jm_595d5550-5548-4c8d-b5af-b4a454956d51/s3-init/0.log" Apr 22 19:46:40.803844 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:40.803812 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-5hs9k/perf-node-gather-daemonset-s9r5h" Apr 22 19:46:42.438631 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:42.438544 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-88ztk_c415df20-fd9e-4f8b-9365-cf896cd78be4/migrator/0.log" Apr 22 19:46:42.463288 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:42.463253 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-88ztk_c415df20-fd9e-4f8b-9365-cf896cd78be4/graceful-termination/0.log" Apr 22 19:46:43.709497 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.709419 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/kube-multus-additional-cni-plugins/0.log" Apr 22 19:46:43.730798 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.730767 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/egress-router-binary-copy/0.log" Apr 22 19:46:43.753320 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.753291 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/cni-plugins/0.log" Apr 22 19:46:43.774973 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.774947 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/bond-cni-plugin/0.log" Apr 22 19:46:43.797290 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.797267 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/routeoverride-cni/0.log" Apr 22 19:46:43.819881 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.819860 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/whereabouts-cni-bincopy/0.log" Apr 22 19:46:43.839978 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:43.839950 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4dcm7_a4a9c2e5-46e7-46c2-8cba-ab6a8dce5c79/whereabouts-cni/0.log" Apr 22 19:46:44.182031 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:44.181996 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-f8jk4_f9c02c13-ac97-4da1-8f21-e45794600da6/kube-multus/0.log" Apr 22 19:46:44.338802 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:44.338772 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_network-metrics-daemon-vjdzq_f01e5c64-eadd-49f5-a2f4-4953111daa69/network-metrics-daemon/0.log" Apr 22 19:46:44.358876 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:44.358845 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-vjdzq_f01e5c64-eadd-49f5-a2f4-4953111daa69/kube-rbac-proxy/0.log" Apr 22 19:46:45.422063 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.422035 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-controller/0.log" Apr 22 19:46:45.441018 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.440987 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/0.log" Apr 22 19:46:45.462979 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.462934 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovn-acl-logging/1.log" Apr 22 19:46:45.483798 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.483761 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/kube-rbac-proxy-node/0.log" Apr 22 19:46:45.503144 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.503121 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:46:45.519778 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.519752 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/northd/0.log" Apr 22 19:46:45.539621 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.539593 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/nbdb/0.log" Apr 22 19:46:45.563566 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.563542 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/sbdb/0.log" Apr 22 19:46:45.671381 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:45.671340 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rx4st_4d26e750-5c11-4023-9012-7dd824eeda4f/ovnkube-controller/0.log" Apr 22 19:46:46.957329 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:46.957283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-452lr_db99ee19-11b0-4246-bee1-d19d9ac8abc1/check-endpoints/0.log" Apr 22 19:46:47.043593 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:47.043558 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-lllsm_42281d48-d81c-48a1-ac06-753e55e4ae05/network-check-target-container/0.log" Apr 22 19:46:47.996444 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:47.996400 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qgkvh_84ddb6a8-3bdd-4890-a3d5-b36eeb73829a/iptables-alerter/0.log" Apr 22 19:46:48.621902 ip-10-0-131-85 kubenswrapper[2571]: I0422 19:46:48.621872 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-wknrq_b04ddd7d-9290-4198-9b08-9617306b7172/tuned/0.log"